Algorithm

An algorithm is a well-defined sequence of instructions or rules that provides a solution to a specific problem or task. Originating from ancient civilizations, algorithms have evolved through centuries and are now integral to modern computing. They are designed using techniques such as divide-and-conquer and are evaluated for efficiency using measures like big O notation. Algorithms can be represented in various forms like pseudocode, flowcharts, or programming languages. They are executed by translating them into a language that computers can understand, with the speed of execution dependent on the instruction set used. Algorithms can be classified based on their implementation or design paradigm, and their efficiency can significantly impact processing time. Understanding and using algorithms effectively is crucial in fields like computer[2] science and artificial intelligence[1].
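
By way of illustration (a sketch added here, not part of the original glossary text), binary search is a classic divide-and-conquer algorithm: each step halves the remaining search space, which is why big O notation rates its running time at O(log n).

def binary_search(items, target):
    """Divide-and-conquer search over a sorted list; runs in O(log n) time."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # split the remaining range in half
        if items[mid] == target:
            return mid            # found: return the index
        elif items[mid] < target:
            lo = mid + 1          # discard the lower half
        else:
            hi = mid - 1          # discard the upper half
    return -1                     # target is not present

print(binary_search([2, 3, 5, 7, 11, 13], 11))  # prints 4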

Term definitions
1. artificial intelligence.
(1) Artificial Intelligence (AI) refers to the field of computer science that aims to create systems capable of performing tasks that would normally require human intelligence. These tasks include reasoning, learning, planning, perception, and language understanding. AI draws from different fields including psychology, linguistics, philosophy, and neuroscience. The field is prominent in developing machine learning models and natural language processing systems. It also plays a significant role in creating virtual assistants and affective computing systems. AI applications extend across various sectors including healthcare, industry, government, and education. Despite its benefits, AI also raises ethical and societal concerns, necessitating regulatory policies. AI continues to evolve with advanced techniques such as deep learning and generative AI, offering new possibilities in various industries.
(2) Artificial Intelligence, commonly known as AI, is a field of computer science dedicated to creating intelligent machines that perform tasks typically requiring human intellect. These tasks include problem-solving, recognizing speech, understanding natural language, and making decisions. AI is categorized into two types: narrow AI, which is designed to perform a specific task, like voice recognition, and general AI, which can perform any intellectual task a human being can. It is a continuously evolving technology that draws from various fields including computer science, mathematics, psychology, linguistics, and neuroscience. The core concepts of AI include reasoning, knowledge representation, planning, natural language processing, and perception. AI has wide-ranging applications across numerous sectors, from healthcare and gaming to military and creativity, and its ethical considerations and challenges are pivotal to its development and implementation.
2. computer. A computer is a sophisticated device that manipulates data or information according to a set of instructions, known as programs. By design, computers can perform a wide range of tasks, from simple arithmetic calculations to complex data processing and analysis. They have evolved over the years, starting from primitive counting tools like the abacus to modern digital machines. The heart of a computer is its central processing unit (CPU), which includes an arithmetic logic unit (ALU) for performing mathematical operations and registers for storing data. Computers also have memory units, like ROM and RAM, for storing information. Other components include input/output (I/O) devices that allow interaction with the machine and integrated circuits that enhance the computer's functionality. Key historical innovations, like the invention of the first programmable computer by Charles Babbage and the development of the first automatic electronic digital computer, the Atanasoff-Berry Computer (ABC), have greatly contributed to their evolution. Today, computers power the Internet, linking billions of users worldwide, and have become an essential tool in almost every industry.
Algorithm (Wikipedia)

In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/) is a finite sequence of rigorous instructions, typically used to solve a class of specific problems or to perform a computation. Algorithms are used as specifications for performing calculations and data processing. More advanced algorithms can use conditionals to divert the code execution through various routes (referred to as automated decision-making) and deduce valid inferences (referred to as automated reasoning), eventually achieving automation. Using human characteristics as descriptors of machines in metaphorical ways was already practiced by Alan Turing with terms such as "memory", "search" and "stimulus".

In a loop, subtract the smaller number from the larger one, and halt the loop when a further subtraction would make a number negative. Then check whether one of the two numbers equals zero: if so, the other number is the greatest common divisor; if not, put the two numbers through the subtraction loop again.
[Flowchart: using successive subtractions to find the greatest common divisor of numbers r and s]
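
The procedure the flowchart describes can be written out directly; the following is a minimal sketch in Python (the function name and test values are illustrative, not part of the original entry):

def gcd_by_subtraction(r, s):
    """Successive-subtraction form of Euclid's method, as in the flowchart."""
    while r != 0 and s != 0:      # loop until a subtraction would go negative
        if r > s:
            r -= s                # subtract the smaller number from the larger
        else:
            s -= r
    return r if r != 0 else s     # the other (nonzero) number is the GCD

print(gcd_by_subtraction(1071, 462))  # prints 21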

In contrast, a heuristic is an approach to problem solving that may not be fully specified or may not guarantee correct or optimal results, especially in problem domains where no well-defined correct or optimal result exists. For example, social media recommender systems rely on heuristics: although widely characterized as "algorithms" in 21st-century popular media, they cannot deliver provably correct results because the underlying problem has no well-defined correct answer. A small illustrative sketch follows.
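
The nearest-neighbour rule for the travelling-salesman problem is a standard example of a heuristic (the function and test coordinates below are illustrative, not from the original entry): it is fully specified and always terminates, yet it offers no guarantee of producing the shortest tour, which is precisely what separates a heuristic from an exact algorithm.

import math

def nearest_neighbour_tour(points):
    """Greedy travelling-salesman heuristic: always visit the closest
    unvisited city next. Fast and fully specified, but it carries no
    guarantee of finding the shortest possible tour."""
    tour = [points[0]]
    unvisited = list(points[1:])
    while unvisited:
        last = tour[-1]
        nearest = min(unvisited, key=lambda p: math.dist(last, p))
        unvisited.remove(nearest)
        tour.append(nearest)
    return tour

print(nearest_neighbour_tour([(0, 0), (5, 1), (1, 0), (2, 3)]))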

As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.
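
That last point can be made concrete with a small sketch (added here for illustration; the sample count is arbitrary): the Monte Carlo estimator below is a randomized algorithm whose intermediate states depend on random input, yet it still passes through a finite number of well-defined states and terminates with an output.

import random

def estimate_pi(samples=100_000):
    """Randomized (Monte Carlo) algorithm: state transitions depend on
    random input, but the computation still terminates after a finite
    number of well-defined steps."""
    inside = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:  # point lands inside the quarter circle
            inside += 1
    return 4 * inside / samples   # area ratio approximates pi

print(estimate_pi())  # prints roughly 3.14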
