AI in Computer Science: A Primer


Matthew N. O. Sadiku1, Philip O. Adebo1, Sarhan M. Musa1

1Roy G. Perry College of Engineering Prairie View A&M University

Prairie View, TX 77446

Abayomi Ajayi-Majebi2

2Department of Manufacturing Engineering Central State University

P.O. Box 1004 Wilberforce, OH 45384-1004

Abstract:- Artificial intelligence (AI) is a branch of computer science that deals with helping machines find solutions to complex problems in a more human-like fashion. AI involves the development of computer systems that can perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages. It has been changing and improving the field of computer science through more advanced programming techniques. This paper examines the current and future roles of AI in computer science.

Key Words: Artificial intelligence, artificial intelligence in computer science

INTRODUCTION

Computer science is the study of computers and computational systems. It covers the theory, design, development, and application of computer software and systems [1]. Artificial intelligence (AI) is a growing branch of computer science. Essentially, it involves borrowing characteristics from human intelligence and applying them as algorithms in a computer-friendly way. AI is the technology concerned with creating machine intelligence able to perform tasks heretofore performed only by human beings. Intelligence is usually regarded as the ability to collect knowledge and use it to solve complex problems. The field of artificial intelligence studies how to give machines human-like abilities such as listening, speaking, reading, writing, thinking, and learning [2,3]. As shown in Figure 1, AI is a science and technology based on disciplines such as computer science, biology, psychology, linguistics, mathematics, and engineering [4].

AI has been changing rapidly and expanding in application. Artificial intelligence has been successfully used to solve problems in diverse applications such as simulated pilots, doctor advisory systems, search engines, computer games, adaptive user interfaces, personalized assistants, natural-language comprehension, and language translation. AI has played a major role in computer science teaching and research since its inception, and it will keep forging its future development.

OVERVIEW ON ARTIFICIAL INTELLIGENCE

The term artificial intelligence (AI) was first used at a Dartmouth College conference in 1956. AI is now one of the most important global issues of the 21st century. AI is the branch of computer science that deals with designing intelligent computer systems that mimic human intelligence, e.g. visual perception, speech recognition, decision-making, and language translation. The ability of machines to process natural language, to learn, and to plan makes it possible for new tasks to be performed by intelligent systems. The main purpose of AI is to mimic the cognitive functions of human beings and perform activities that would typically be performed by a human being. Rather than being explicitly taught by humans, machines can use their own experience to solve problems.

AI can act as a stand-alone, independent electronic entity that functions much like a human expert. Today, AI is integrated into our daily lives in several forms, such as personal assistants, automated mass transportation, aviation, computer gaming, facial recognition at passport control, voice recognition on virtual assistants, driverless cars, companion robots, etc. AI is not a single technology but a range of computational models and algorithms.

Some forms of AI that are most commonly used in different applications include the following [5,6]:

  • Expert systems: They solve problems with an inference engine that draws from a knowledge base equipped with information about a specialized domain, mainly in the form of if-then rules. Expert systems are among the earliest, most extensive, most active, and most fruitful areas of AI.

  • Fuzzy logic: This makes it possible to create rules for how machines respond to inputs that account for a continuum of possible conditions, rather than a straightforward binary choice.

  • Neural networks: These are machine learning systems made up of artificial neurons and synapses designed to imitate the structure and function of the human brain. Each neuron takes in multiple inputs and produces a single output. The network observes and learns as the synapses transmit data to one another, processing information as it passes through multiple layers.

  • Machine learning: This includes a broad range of algorithms and statistical models that make it possible for systems to find patterns, draw inferences, and learn to perform tasks without specific instructions. Machine learning applies AI so that a system can automatically perform a specific task without being explicitly programmed for it. ML techniques may yield data insights that increase production efficiency. Today, artificial intelligence is narrow and mainly based on machine learning.

  • Deep learning: This is a form of machine learning based on artificial neural networks. Deep learning architectures are able to process hierarchies of increasingly abstract features, making them especially useful for purposes like speech and image recognition and natural language processing. Deep learning networks can deal with complex non-linear problems.

  • Natural language processors: For AI to be useful to humans, it needs to be able to communicate with us in our own language. Computer programs can now translate or interpret language as it is spoken by ordinary people.

  • Robots: These are computer-based programmable machines that have physical manipulators and sensors. Sensors can monitor temperature, humidity, pressure, and time, record data, and in some cases make critical decisions. Robots have moved from science fiction to your local hospital. In jobs with repetitive and monotonous functions, they might even completely replace humans. Robotics and autonomous systems are regarded as part of the fourth industrial revolution. Robot police with facial recognition technology have started to patrol the streets in China.

These AI tools are illustrated in Figure 2 [7]. Each AI tool has its own advantages. Using a combination of these models, rather than a single model, is recommended. AI systems are designed to make decisions using real-time data. They have the ability to learn and adapt as they make decisions. Figure 3 illustrates the relationship between computer science (CS), artificial intelligence (AI), and machine learning (ML) [8].
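The if-then rule style used by expert systems can be sketched with a minimal forward-chaining inference engine; the medical-sounding rules and facts below are invented purely for illustration.

```python
# Minimal forward-chaining inference engine: repeatedly applies
# if-then rules to a set of known facts until nothing new is derived.
rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),  # IF fever AND cough THEN possible flu
    ({"possible_flu"}, "recommend_rest"),          # IF possible flu THEN recommend rest
]

def infer(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # Fire a rule when all its conditions are known and its
            # conclusion is not yet in the fact base.
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"has_fever", "has_cough"}, rules))
```

Note how the second rule fires only because the first one added "possible_flu" to the fact base; this chaining of rules is what distinguishes an inference engine from a simple lookup table.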
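The continuum of conditions that fuzzy logic handles, as opposed to a binary cutoff, can be sketched with a simple membership function; the temperature thresholds here are illustrative assumptions, not part of any standard.

```python
def warm_membership(temp_c):
    """Degree (0..1) to which a temperature counts as 'warm'.

    Below 15 C: not warm at all; above 25 C: fully warm;
    in between: a linear ramp rather than a yes/no cutoff.
    """
    if temp_c <= 15:
        return 0.0
    if temp_c >= 25:
        return 1.0
    return (temp_c - 15) / 10.0

print(warm_membership(20))  # 0.5: partially warm
```

A fuzzy controller would combine several such membership degrees through fuzzy rules and then "defuzzify" the result into a single output value.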
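A single artificial neuron of the kind described above, taking multiple inputs and producing one output, can be sketched as follows; the weights and bias are chosen arbitrarily for illustration (in practice they are learned from data).

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias, passed through a sigmoid
    # activation that squashes the result into the range (0, 1).
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

out = neuron([0.5, 0.8], [0.4, -0.2], 0.1)
print(round(out, 3))  # 0.535
```

A neural network stacks many such neurons into layers, feeding each layer's outputs into the next; deep learning refers to networks with many such layers.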

APPLICATIONS OF AI IN COMPUTER SCIENCE

Artificial intelligence has brought several changes to our modern society, such as smartphones, face recognition, search engines, machine translation, autonomous driving, and smart medical treatment. As technology improves, there will be many new applications of artificial intelligence. Applications of artificial intelligence in computer science include the following [9, 10].

  • Robotics: AI-enabled robotics is an emerging and fast-developing technology. Robots often have shared sets of programming that allow them to function and communicate.

  • Self-Modifying Coding: AI is now being put into programming languages to create self-modifying groups of code. AI programming languages include LISP, PROLOG, and object-oriented languages.

  • Speech and language processing: It is becoming common for computers to understand and produce spoken human language in order to give and receive orders.

  • Data Mining: Data is mined or sorted and analyzed to find certain patterns, anomalies, or other values within extremely large volumes of information.

  • Visualizations: Computer programs can now make visualizations, and artificial intelligence will greatly enhance this process.

  • Marketing programs: These are appealing for businesses that do not want to invest large sums of money in building marketing campaigns.

  • Image recognition: The ability of a program to recognize and decode an image is appealing and has many applications.

  • Cloud Computing: The ability to store and access data in the cloud is revolutionizing how people can access information from many locations. AI is going to help make this process more organized and systematic in the future.

  • Game playing: Game playing and theorem proving were two of the earliest attempts at getting computers to think intelligently. Popular games include Chess, Go, Kalah, and Checkers, many of which incorporate AI techniques.

  • Computer Science Education: Every branch of the educational sector is significantly affected by AI. AI is incorporated into the Computer Science curriculum. It plays an important role in the instruction of Computer Science courses. One can observe recent advances of virtual reality, augmented reality, and artificial intelligence and their applications to educational process.
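The search for anomalies that data mining performs on large volumes of information can be sketched with a simple z-score test; the data set and the two-standard-deviation threshold below are illustrative choices, not a prescribed method.

```python
import statistics

def anomalies(values, threshold=2.0):
    # Flag values lying more than `threshold` standard deviations
    # from the mean of the data set.
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []  # all values identical: nothing stands out
    return [v for v in values if abs(v - mean) / stdev > threshold]

data = [10, 11, 9, 10, 12, 10, 11, 50]
print(anomalies(data))  # [50]
```

Real data-mining systems use far more sophisticated models, but the principle is the same: characterize the bulk of the data statistically, then surface the points that do not fit.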
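The classic technique behind game playing is minimax game-tree search, in which two players alternately pick the best move for themselves, assuming the opponent does likewise. The sketch below runs it on a toy game tree with invented payoff values.

```python
def minimax(node, maximizing):
    # Leaf nodes are plain numbers (the game's payoff for the
    # maximizing player); internal nodes are lists of child subtrees.
    if isinstance(node, (int, float)):
        return node
    # Players alternate: the child level is evaluated for the opponent.
    values = [minimax(child, not maximizing) for child in node]
    return max(values) if maximizing else min(values)

# Toy two-ply game tree: the maximizing player moves first.
tree = [[3, 5], [2, 9]]
print(minimax(tree, True))  # 3: max(min(3, 5), min(2, 9))
```

Practical game programs add alpha-beta pruning and heuristic evaluation functions on top of this core recursion so that deep trees become searchable.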

BENEFITS AND CHALLENGES

Artificial intelligence has achieved world-renowned results in many fields such as knowledge processing, pattern recognition, natural language processing, games, automatic theorem proving, automatic programming, expert systems, knowledge bases, intelligent robots, data mining, and artificial neural networks. Artificial intelligence has advantages over natural (human) intelligence: it is more permanent, more consistent, less expensive, easier to duplicate and disseminate, and it can perform certain tasks much faster and better than humans.

Artificial intelligence is a challenging and creative area within computer science. Software for complex systems, speech recognition, and neural networks present challenges in machine learning and in reasoning under uncertainty. One of the challenges in teaching is that every student has a different pace of learning and understanding of instructions.

CONCLUSION

Artificial intelligence is an emerging science used in simulating human intelligence. It plays a significant role in both the research and teaching of computer science. It has permeated our daily work and life. It has become an indispensable technology for us now and a strategic technology for the future. As new areas in computer science, engineering, medicine, etc. are discovered, the application of AI correspondingly grows. Most companies are still exploring opportunities for integrating AI into different devices, and such companies employ computer scientists to assist in different areas of AI. In the future, intelligent machines, powered by AI, will replace or enhance human capabilities.

REFERENCES

  1. A. Pathak, Is artificial intelligence a part of computer science? October 2018, https://www.technotification.com/2018/10/is-ai-part-of-computer-science.html

  2. M. N. O. Sadiku, "Artificial intelligence", IEEE Potentials, May 1989, pp. 35-39.

  3. H. Li and H. Wang, Research on the application of artificial intelligence in education, The 15th International Conference on Computer Science & Education, 2020.

  4. Artificial intelligence tutorial, https://www.javatpoint.com/artificial-intelligence-tutorial

  5. M. N. O. Sadiku, Y. Zhou, and S. M. Musa, Natural language processing in healthcare, International Journal of Advanced Research in Computer Science and Software Engineering, vol. 8, no. 5, May 2018, pp. 39-42.

  6. Applications of AI and machine learning in electrical and computer engineering, July 2020, https://online.egr.msu.edu/articles/ai-machine-learning-electrical-computer-engineering-applications/

  7. https://www.pinterest.com/pin/A5-3PwEQgKcFup8V3BepN0Q/

  8. R. Reynoso, Understanding artificial intelligence and machine learning, June 2019, https://learn.g2.com/artificial-intelligence-and-machine-learning

  9. 8 Applications of artificial intelligence in computer science, https://www.businessworldit.com/ai/applications-of-artificial-intelligence/

  10. N. Pillay, Artificial intelligence in computer science teaching and research, https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.469.5198&rep=rep1&type=pdf

ABOUT THE AUTHORS

Matthew N.O. Sadiku is a professor emeritus in the Department of Electrical and Computer Engineering at Prairie View A&M University, Prairie View, Texas. He is the author of several books and papers. His areas of research interest include computational electromagnetics and computer networks. He is a fellow of IEEE.

Philip O. Adebo is an instructor at Texas Southern University. He completed his PhD in Electrical and Computer Engineering at Prairie View A&M University with emphasis on power systems. His research interests include power systems, renewable energy, microgrids, smart-grid systems, power system restructuring, and optimization of power systems.

Abayomi Ajayi-Majebi is a professor in the Department of Manufacturing Engineering at Central State University in Wilberforce, Ohio. In 2015 he was honored by the White House as a Champion of Change for his significant contributions to the engineering education of minority students. He is a senior member of both the Society of Manufacturing Engineers and the American Society for Quality.

Sarhan M. Musa is a professor in the Department of Electrical Engineering at Prairie View A&M University, Texas. He has been the director of Prairie View Networking Academy, Texas, since 2004. He is an LTD Sprint and Boeing Welliver Fellow. His research interests include computer networks and computational electromagnetics.

Figure 1 Artificial intelligence comprises fields such as computer science, neuroscience, psychology, mathematics, engineering, etc. [4].

Figure 2 AI tools [7].

Figure 3 Relationship between computer science (CS), artificial intelligence (AI), and machine learning (ML) [8].
