
Learning to Think Like a Computer to Solve Problems as a Human

Sofia Fall, ‘25

September 29th, 2024


Years ago, during my second-to-last year of middle school, I was struggling with math. In the midst of the COVID pandemic, my remote Zoom classes weren’t holding my attention. My father decided to take me back to the basics, often quizzing me on relatively simple concepts. One evening, he asked: “Why is any number raised to the power of zero equal to one?” The answer seemed straightforward—because that’s the rule—but it felt like a trick question. I had always accepted it as one of those mathematical facts you just "know" without ever questioning why. Feeling a bit flustered, I gave him a textbook explanation about exponents and how multiplying by zero always results in zero. But I could tell from his expression that my answer hadn’t quite landed.


After thinking about it for a while, I realized that, just like with other math concepts, the rule wasn’t arbitrary—it had a logical foundation that made perfect sense once you saw it clearly. When you think of exponents as a way to repeatedly multiply a number by itself, it becomes obvious why any number raised to the power of 1 is simply itself. If you start with, say, 2³ (which equals 8) and then reduce the exponent by one each time, the result divides by 2: 2² is 4, 2¹ is 2. Continue the pattern one more step and 2⁰ must be 2¹ divided by 2, which is 1.
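That dividing-down pattern is easy to check for yourself; here is a quick sketch in Python:

```python
# Walk the exponent down from 3 to 0; each step divides the result by the base.
# The same pattern holds for any nonzero base, which is why b ** 0 == 1.
base = 2
value = base ** 3  # 2^3 = 8
for exponent in range(3, -1, -1):
    print(f"{base}^{exponent} = {value}")
    value = value / base  # lowering the exponent by one divides by the base

print(base ** 0 == 1)  # True
```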


That method of breaking down every problem I encountered followed me into high school, where I became interested in computer vision—essentially, how to program a computer to interpret and “understand” the data it receives from a visual sensor. Around this time, I began practicing computational thinking (CT for short): conceptualizing problems in ways that allow a computer to assist in solving them. Consider how you approach mathematical word problems: you learn to translate the words into mathematical equations, solve them, and then interpret your answer within the original context. Computational thinking works similarly, but it extends beyond mathematics. It enables us to tackle complex, real-world challenges in fields like science, language, and history—essentially, in any subject imaginable.


Image of me writing this article, doodled by me, 2024.

Let's be honest: tech is already integrated into all sectors of the workforce—from medicine to agriculture to education—so knowing how to think computationally is becoming as essential as literacy or numeracy.


This skill set is becoming recognized as vital to the modern workforce, equipping individuals not just to use technology but to shape and influence the future of innovation.








A Simplified History of Computers and Computational Thinking

The foundation of modern computing was laid by an English mathematician named Alan Turing.


(Photograph of Alan Turing, Britannica)

Turing wasn’t out to create a groundbreaking gadget or launch an industry. Instead, in 1936, he was grappling with a complex mathematical problem posed by the German mathematician David Hilbert in 1928. Understanding Hilbert’s challenge not only sheds light on the origins of computers but also hints at their future potential.

Hilbert was deeply interested in the limits of human knowledge in mathematics. He posed the Entscheidungsproblem (German for "decision problem"): was there a method or algorithm that could determine whether any given mathematical statement was provable?


Hilbert believed that if a universal algorithm existed, all of mathematics could be solved. At the time, algorithms were associated with manual tasks like multiplication, but Turing redefined this concept by creating the Turing machine, a theoretical device capable of executing any algorithm. His revolutionary idea laid the groundwork for modern computers. To prove its potential, Turing analyzed how mathematicians solve problems and showed how his machine could replicate those steps. Though his reasoning was complex, Turing’s work paved the way for the field of computer science as we know it today (Britannica, n.d., Alan Turing). Meanwhile, Hilbert's challenge became part of a larger legacy in mathematics, with his influence extending far beyond this singular problem.


(Photograph of David Hilbert, Britannica)

Born in 1862, Hilbert made significant contributions to the formalistic foundations of mathematics, reducing geometry to a series of axioms and laying the groundwork for modern mathematical logic. His work on integral equations in 1909 sparked research in functional analysis, influencing the trajectory of 20th-century mathematics (Britannica, n.d., David Hilbert).


Fast forward to 1966, when Alan Perlis, a key figure in early computing, became the first recipient of the prestigious A.M. Turing Award.

(Photograph of Alan Perlis, Britannica)

A staunch advocate for the significance of algorithmic thinking in problem-solving, Perlis promoted the concept of "procedural epistemology." This idea encourages us to view the world through the lens of algorithms and processes, rather than relying solely on static knowledge. He famously remarked, “To understand a program, you must become both the machine and the program.” This perspective extends beyond computer science; it resonates with anyone engaged in problem-solving, regardless of their field.


One of Perlis's more significant contributions was his focus on simplifying complex issues into smaller, more manageable components. He understood that by addressing problems systematically—breaking them down into a sequence of steps that a machine could execute—we can create solutions that are both effective and sustainable. Perlis championed the idea that learning to think algorithmically was essential not only for computer scientists but for anyone engaged in problem-solving across disciplines (Britannica, n.d., Alan Perlis).

(Photograph of Donald Knuth, Wikipedia)

Born on January 10, 1938, in Milwaukee, Wisconsin, Donald Ervin Knuth is best known for his authoritative multivolume series, The Art of Computer Programming. This seminal work advanced the understanding of algorithm design by emphasizing the formalization and analysis of algorithms. His contributions have established a rigorous framework for evaluating algorithmic efficiency, notably through the introduction of Big-O notation. This fundamental concept enables computer scientists to assess how the execution time of an algorithm scales with varying input sizes.
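Big-O becomes concrete when you count steps instead of measuring seconds. The sketch below (my own illustration, not Knuth's analysis) compares a linear scan, which takes O(n) comparisons in the worst case, with binary search on sorted data, which takes O(log n):

```python
def linear_search_steps(items, target):
    """Scan left to right; the worst case touches every element -> O(n)."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def binary_search_steps(items, target):
    """Halve the sorted search range on each step -> O(log n)."""
    steps, lo, hi = 0, 0, len(items) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

data = list(range(1_000_000))
print(linear_search_steps(data, 999_999))  # about a million steps
print(binary_search_steps(data, 999_999))  # about twenty steps
```

Doubling the input roughly doubles the linear scan's work but adds only one step to binary search, which is exactly the difference Big-O notation captures.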


Knuth's research encompasses essential topics such as data structures, sorting, searching algorithms, and efficient algorithmic design. These foundational principles form the bedrock of modern computational thinking. His influence extends to various domains of computer science, including cloud computing, artificial intelligence, and cryptography (Britannica, n.d., Donald Knuth).


(Photograph of Seymour Papert, Britannica)

Seymour Papert, who later coined the term "computational thinking," significantly expanded the application of these ideas from technical fields into education. As a student of Jean Piaget, Papert was a pioneer in utilizing computers as educational tools. He developed the Logo programming language specifically to introduce children to coding, with the broader objective of fostering new cognitive frameworks.


Papert recognized that computers could serve a purpose beyond mere calculation; they had the potential to transform how students approached problem-solving. His vision was ambitious: he believed that computational thinking could democratize learning, empowering young people to engage in logical, creative, and systematic thinking at a deeper level. His contributions laid the groundwork for the future integration of computational thinking into educational practices, fundamentally reshaping how knowledge is imparted in classrooms (Britannica, n.d., Seymour Papert).


Wing’s 2006 Paper and the Mainstreaming of Computational Thinking


(Photograph of Jeannette M. Wing, Columbia University)

While the foundational ideas behind computational thinking had been taking shape for decades, it wasn’t until Jeannette Wing’s influential 2006 paper that CT became a mainstream concept. Wing argued that computational thinking should be regarded as a fundamental skill for everyone, not just computer scientists. She outlined the cognitive processes involved: decomposition, pattern recognition, abstraction, and algorithmic design. In Wing's vision, these methods are applicable far beyond computer science, influencing how we approach decision-making and problem-solving in everyday life.


Wing emphasized that computational thinking involves not only solving problems and designing systems but also understanding human behavior, and she made the case that incorporating this skill into every child's analytical repertoire is essential to their understanding of science, technology, engineering, and math (STEM). Since Wing’s 2006 publication, computational thinking has been incorporated into curricula worldwide. Countries like the United Kingdom have integrated it into their national curricula, teaching students from primary school onward how to code, solve algorithmic problems, and think abstractly.


The “Three As” of Computational Thinking: Abstraction, Automation, and Analysis


Computational thinking can be broken down into three key processes: Abstraction, Automation, and Analysis—often referred to as the "Three As." These steps form a roadmap for solving complex problems, whether you're designing a climate model or organizing your daily tasks.


Abstraction is all about simplifying. You focus on the most important parts of a problem while setting aside the details that don’t matter as much. Imagine you’re building a climate model—your main concerns might be carbon emissions and global temperatures. You wouldn’t get lost in irrelevant data like the color of the skies on specific days.


Next is Automation, where you turn your solution into something a computer can run. Usually, this involves writing an algorithm or a program. In our climate model example, this would mean creating a set of instructions for the computer to simulate future climate scenarios based on the key factors you’ve identified.


Finally, there’s Analysis. This is where you test how well your solution works. You might run simulations, tweak some parameters, or adjust the algorithm to make it more accurate. For the climate model, you would check if it reliably predicts temperature changes under different conditions.
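The three steps can be made concrete with a deliberately tiny toy model. The warming formula and its sensitivity constant below are invented for illustration only; they are not real climate science:

```python
# Abstraction: reduce "climate" to two variables we care about.
# Automation: encode the simulation as a function a computer can run.
# Analysis: run scenarios and compare the outcomes.

def simulate_warming(start_temp_c, annual_emissions_gt, years, sensitivity=0.01):
    """Toy model: temperature rises in proportion to cumulative emissions.
    The sensitivity constant is made up for the sake of the example."""
    temp = start_temp_c
    for _ in range(years):
        temp += annual_emissions_gt * sensitivity
    return temp

# Analysis step: compare a high-emissions and a low-emissions scenario.
high = simulate_warming(start_temp_c=15.0, annual_emissions_gt=40, years=50)
low = simulate_warming(start_temp_c=15.0, annual_emissions_gt=10, years=50)
print(f"High emissions: {high:.1f} deg C, low emissions: {low:.1f} deg C")
```

In a real model the Analysis step would loop back into Abstraction: if the predictions miss, you revisit which variables you chose to keep.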


Image source: “Computational thinking,” Wikipedia

The beauty of the "Three As" is their flexibility. Whether you’re a scientist, teacher, or working in another field, this method gives you a structured approach to tackle problems step by step.



Word Problems and Computational Thinking


Now that you understand the steps, how can we apply them? When solving a word problem in school, the process generally involves reading the problem, extracting key details, converting them into mathematical expressions, solving, and applying the solution back to the context. Computational thinking follows a similar approach. For example, if you're developing a recommendation system for a streaming service, you'd begin by breaking the task into smaller parts (decomposition), like identifying user preferences. Then, you'd recognize patterns in users' viewing habits (pattern recognition), focus on essential features like genre or viewing history (abstraction), and design an algorithm to automate recommendations (algorithmic design).


Please indulge me by trying another example: building an anime recommendation system. You'd start with decomposition by breaking down the problem into parts, such as identifying user preferences, classifying genres, and selecting key features like character development, storyline, or art style. Pattern recognition would help you identify commonalities among anime that certain users enjoy, such as shared themes or animation styles. Abstraction would then allow you to focus on the most relevant data, like a user's favorite genre or character type. Finally, you'd use algorithm design to create a recommendation engine that draws on the identified patterns and abstractions to suggest shows based on user preferences. Easy enough, right?
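The four steps can be sketched in a few lines of Python; the shows and tags below are invented examples, not real recommendation data:

```python
# Abstraction: each show is reduced to a small set of descriptive tags.
anime_tags = {
    "Fullmetal Alchemist": {"action", "fantasy", "character-driven"},
    "Your Name":           {"romance", "drama", "scenic-art"},
    "Attack on Titan":     {"action", "drama", "dark"},
    "Violet Evergarden":   {"drama", "character-driven", "scenic-art"},
}

def recommend(watched, catalog):
    """Pattern recognition: pool the tags of everything the user enjoyed.
    Algorithm design: rank unwatched shows by how many tags they share."""
    liked_tags = set().union(*(catalog[title] for title in watched))
    candidates = [t for t in catalog if t not in watched]
    return sorted(candidates,
                  key=lambda t: len(catalog[t] & liked_tags),
                  reverse=True)

print(recommend({"Fullmetal Alchemist", "Attack on Titan"}, anime_tags))
# prints ['Violet Evergarden', 'Your Name']
```

Decomposition is what let us split "recommend anime" into the three smaller jobs the code performs: describing shows, pooling preferences, and ranking candidates.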


Note: In this section, the other participant’s name has been anonymized as “Leo.”


Our Goals

Image source: Australian Curriculum, Assessment and Reporting Authority [ACARA], n.d.

Given the time constraints, we both chose simple goals. I aimed to create an algorithm—a series of ordered steps taken to solve a problem—and focus on evaluation, which involves determining the effectiveness of a solution and generalizing it to apply to new problems. Meanwhile, Leo wanted to explore pattern recognition, analyzing the data to look for patterns that make sense of the information, as well as decomposition, which involves breaking problems into smaller parts.


In ‘Why Don’t Students Like School?’, psychologist Daniel Willingham points out that, “Sometimes I think that we, as teachers, are so eager to get to the answers that we do not devote sufficient time to developing the question.” A fundamental element of exceptional writing is the skill to cultivate a question, letting the intricacies of it linger in the reader's mind before revealing the answer. Imagine if I could master that technique. Could I effectively develop a formula for crafting compelling narratives?


Leo’s goal was to explore how fireflies synchronize their flashing patterns using the computational thinking framework. She explained, “Computational thinking in biology is all about using algorithms and computational methods to investigate natural phenomena like firefly synchronization. It’s exciting because it takes us beyond just observing the behavior—we can analyze the underlying patterns and processes. Instead of simply memorizing facts, we’re digging into the ‘why’ and ‘how’ behind these biological events, breaking down complex systems and discovering new insights that traditional methods might overlook.”


What We Did


Earlier, I touched on Wing's (2006) definition of computational thinking (CT) as a method for "solving problems, designing systems, and understanding human behavior by drawing on the concepts of computer science." Although her original framework is frequently regarded as foundational in the subject, it primarily focuses on computer science and has a strong technical bent. She emphasizes, for instance, that “even at early grades, we can viscerally show the difference between a polynomial-time algorithm and an exponential-time one” (2008, p. 3721). Wing asserts that computational thinking is "everywhere" and "for everyone," highlighting its applicability across different disciplines.


In 2016, the Computer Science Teachers Association (CSTA) revised its definition of computational thinking, describing it as a methodology for problem-solving that extends beyond computer science. This approach allows for analyzing and developing solutions through computational methods (CSTA, 2016). Computational thinking can also be viewed as the knowledge, skills, and attitudes necessary to leverage computers for solving real-world problems productively (Özden, 2015). Our intention was to utilize CT for a single purpose: structuring the process of working through an inquiry.


To do so, we adapted the core elements of computational thinking for our specific inquiries. In my case, the inquiry focused on developing an algorithmic approach to narrative structure. Leo, on the other hand, explored the synchronization behavior of fireflies. The following tables illustrate how we tailored computational thinking elements to each inquiry:

Table 1. Adaptation and sequencing of selected computational thinking elements for the purposes of self instruction. Selections drawn from Grover & Pea (2013).
Table 2. Adaptation and sequencing of selected computational thinking elements for the purposes of self instruction. Selections drawn from Grover & Pea (2013).

What We Learned

Leo: Fireflies DO NOT follow a leader or deliberately coordinate their flashing. Each firefly has its own internal clock and flashes when the time is right for it. For almost a hundred years, scientists couldn’t figure out how whole swarms still end up flashing together, until a breakthrough study in 1992 showed the mechanism: each firefly reacts to the flashes of other nearby fireflies, adjusting its timing slightly. This small shift spreads through the group, creating a ripple effect that eventually leads to the whole swarm flashing together in a wave of light.
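To see that ripple effect in action, here is a simplified simulation sketch of pulse-coupled clocks, loosely inspired by the oscillator models used to study firefly synchrony. All the numbers (period, nudge size, swarm size) are invented for illustration:

```python
import random

def simulate_fireflies(n=10, steps=3000, period=100, nudge=5, seed=7):
    """Each firefly keeps its own internal clock (its phase) and flashes
    when the clock rolls over. A flash nudges every neighbor's clock
    slightly forward; a neighbor pushed past the threshold flashes too,
    in a cascade. No firefly leads, yet clocks that once flash together
    stay together, so flash groups can only merge over time."""
    rng = random.Random(seed)
    phases = [rng.randrange(period) for _ in range(n)]
    groups_at_start = len(set(phases))
    for _ in range(steps):
        for i in range(n):
            phases[i] += 1  # every internal clock ticks forward
        to_flash = [i for i in range(n) if phases[i] >= period]
        flashed = set()
        while to_flash:
            f = to_flash.pop()
            flashed.add(f)
            for i in range(n):
                if i not in flashed and phases[i] < period:
                    phases[i] += nudge  # the ripple effect of seeing a flash
                    if phases[i] >= period:
                        to_flash.append(i)  # pulled into the same flash
        for i in flashed:
            phases[i] = 0  # flashing resets the clock
    return groups_at_start, len(set(phases))

before, after = simulate_fireflies()
print(f"distinct flash groups: {before} at the start, {after} after many cycles")
```

Because clocks that flash together stay together, the number of distinct flash groups can only shrink; whether the whole swarm collapses into a single group depends on the parameters, and the published mathematical models use a nonlinear charging curve to guarantee full synchrony.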


Me: In a similar way, writing is also about making decisions and solving problems. Writing is an ongoing process of planning, coming up with ideas, organizing, and revising. Research on composition has shown that writing is a form of problem-solving, with writers constantly adjusting and making choices. This is where computational thinking (CT) and writing overlap. For multilingual learners, whose first language isn’t English, writing becomes even more complex, layering in grammar, syntax, and getting the main point across. Writing is a key Higher Order Thinking Skill (HOTS), demanding critical analysis and the ability to organize thoughts clearly.


Bringing together writing, computational thinking, and HOTS can really improve the research process. Writing helps to clarify and structure ideas, while computational thinking strengthens problem-solving and analysis. These skills—critical thinking, problem-solving, and creativity—are essential for solid research (Huang et al., 2022; Phakiti, 2018).

Inspired by the principles of the Feynman Technique, I developed an algorithm to tackle the intricate demands of writing a narrative essay. The first step in my approach is to clearly outline the central theme or story, which often begins with brainstorming the main idea and identifying key events that will drive the narrative. By breaking down the narrative into its essential components—characters, setting, conflict, and resolution—I create a structured framework for my essay. This way, I can focus my efforts effectively and avoid getting sidetracked by details that don’t contribute to the overall story. Each section becomes a series of bullet points, representing specific scenes or character developments that help streamline my writing process.
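My "algorithm" here is really an ordered checklist, and it can be sketched in code. The components and sample notes below are my own invention, not a standard writing method:

```python
# Decompose the narrative into its essential components, then attach each
# brainstormed note to the component it serves, mirroring the outline
# process described above. Notes that serve no component get cut.
NARRATIVE_COMPONENTS = ["characters", "setting", "conflict", "resolution"]

def build_outline(theme, notes):
    """notes is a list of (component, note) pairs from brainstorming."""
    outline = {part: [] for part in NARRATIVE_COMPONENTS}
    sidetracked = []
    for component, note in notes:
        if component in outline:
            outline[component].append(note)
        else:
            sidetracked.append(note)  # a detail that would derail the story
    return {"theme": theme, "outline": outline, "cut": sidetracked}

draft_plan = build_outline(
    "learning to ask better questions",
    [("characters", "my father, quizzing me at the kitchen table"),
     ("conflict", "a 'simple' question I couldn't really answer"),
     ("trivia", "the weather that evening"),  # doesn't serve the story
     ("resolution", "seeing the rule's logic for myself")],
)
print(draft_plan["cut"])  # prints ['the weather that evening']
```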


Next, I try to understand the audience's expectations. Instead of aiming for overly complex language or ornate prose, my goal is to engage my readers by crafting vivid imagery and relatable characters. I structure my narrative with clear transitions and purposeful pacing, ensuring that the flow keeps the reader's interest alive. I also remind myself to weave in personal reflections or thematic insights, so my story resonates on a deeper level. This preparation leads to a draft that’s not just a collection of events but a carefully constructed narrative that reflects my voice and intention. Regular feedback from peers and strategic revisions are crucial to refining my work, ensuring it captures the essence of the story while meeting the standards expected in narrative writing.


In the end, I found this method to be highly effective, and Leo shared the same enthusiasm for its practical application. I encourage everyone to give it a try!



Works Cited



Britannica. (n.d.). Alan Turing. https://www.britannica.com/biography/Alan-Turing


Britannica. (n.d.). David Hilbert. https://www.britannica.com/biography/David-Hilbert


Britannica. (n.d.). Alan Jay Perlis. https://www.britannica.com/biography/Alan-Jay-Perlis


Britannica. (n.d.). Donald Knuth. https://www.britannica.com/biography/Donald-Knuth


Britannica. (n.d.). Seymour Papert. https://www.britannica.com/biography/Seymour-Papert


Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33–35. https://doi.org/10.1145/1118178.1118215


Wing, J. M. (2008). Computational thinking and thinking about computing. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 366(1881), 3717–3725. https://doi.org/10.1098/rsta.2008.0118


Computational thinking. (n.d.). In Wikipedia. Retrieved August 28, 2024, from https://en.wikipedia.org/wiki/Computational_thinking



Willingham, D. T. (2009). Why don't students like school? (1st ed.). Jossey-Bass.


Özden, M. Y. (2015). Computational thinking. Retrieved from http://myozden.blogspot.com.tr/2015/06/computational-thinking-bilgisayarca.html

Huang, Y. M., & Soman, D. (2022). Applying a business simulation game in a flipped classroom to enhance engagement, learning achievement, and higher-order thinking skills. Computers & Education, 177, 104378.


Australian Curriculum, Assessment and Reporting Authority (ACARA). (n.d.). Computational thinking in practice: Parent-teacher cards. https://www.australiancurriculum.edu.au/media/5908/computational-thinking-in-practice-parent-teacher-cards.pdf


Grover, S., & Pea, R. D. (2017). Computational thinking: A competency whose time has come. In Computer science education (Chap. 3). https://doi.org/10.5040/9781350057142.ch-003


