Artificial Intelligence

The concept of Artificial Intelligence (AI) was first coined in 1955 by Stanford Professor John McCarthy, as the science and engineering of making intelligent machines. In other words, it is a branch of Computer Science in which a machine mimics the cognitive functions that are associated with the human mind – such as learning and problem solving. Some famous examples include high-level strategic gameplay (e.g. when Google's DeepMind beat the world's best Go player) and autonomous cars. For machines to act and react like humans, they need access to sufficient information from the real world. To be able to initiate common sense, reasoning, and problem-solving power, AI needs knowledge engineering – which provides data about objects, categories, properties, and relations between them.

Application of Artificial Intelligence

AI solutions in many application areas have now increased the acceptance of AI and the very high expectations associated with it. AI can be applied in any domain where it can be trained on vast amounts of content and can automatically adapt and learn from its mistakes and failures. Repetitive, routine actions that require large amounts of data or knowledge have the highest potential to benefit from an AI system. High-adoption segments that are currently accelerating exponentially include:

a) Medicine & Healthcare: Classifications for radiology and pathology, cancer cell detection, diabetic grading, drug discovery.

b) Research in all areas of Science and Engineering: Scientific Data Analysis and Expert Systems.

c) Customer Service: Robotic Concierge, Virtual Assistant, Product Recommendation.

d) Maintenance: Failure Prediction, Maintenance Planning, Anomaly Detection.

e) Operations: Resource Planning, Supply Chain & Logistics, Security & Fraud Detection.

f) Automated Devices: Connected Cars, Robotics, and Intelligent Video Analytics.

g) Banking, Financial Services, Insurance: Wealth Management, Portfolio Planning, Risk Assessment, Underwriting.

h) Other Areas: Public Cloud, Internet, Image Classification, Speech Recognition, Language Translation, Sentiment Analysis, Question Answering, Recommendations.

AI deep learning is a paradigm-shifting technology driving the fourth industrial revolution. It will transform everything. India has a strategic opportunity to be a global talent provider to the world for Artificial Intelligence.

The major objective of developing this lab is to enable the Institute to offer the very best education, training, and research facility in Artificial Intelligence powered by GPUs, with a focus on deep learning, machine learning, data science, and analytics. A technology-powered lab will enable academic institutions to set up an AI lab, start electives and short-term courses, and take up research efforts in AI: deep learning, machine learning, data science, and analytics. The Artificial Intelligence Lab can also help to train developers, data scientists, and researchers to use deep learning and accelerated computing to solve real-world problems across a wide range of domains. With access to GPU-accelerated workstations in the cloud, researchers and students will learn how to train, optimize, and deploy neural networks using the latest deep learning tools, frameworks, and SDKs. They can also learn how to assess, parallelize, optimize, and deploy GPU-accelerated computing applications.

GPU computing leverages the parallel processing capabilities of GPU accelerators and enabling software to deliver dramatic increases in performance for scientific, data analytics, engineering, consumer, and enterprise applications. Many sound research facilities can be developed for bridging domain experts in various disciplines and their computing needs. There is an increasing trend of using graphics processing units (GPUs) for compute-intensive tasks, as these processors have a large number of independent cores on a single chip. This has led to the development of supercomputers built on GPUs. The programming of such multicore processors is important in the current high-performance computing domain.
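The data-parallel pattern that GPUs exploit can be illustrated in plain Python. The sketch below (all function names are hypothetical, and a thread pool stands in for the thousands of independent GPU cores) splits an elementwise operation into chunks that carry no dependencies on one another, which is exactly why such work maps so well onto many-core processors:

```python
from concurrent.futures import ThreadPoolExecutor

def saxpy_chunk(a, xs, ys):
    """Compute a*x + y elementwise for one chunk -- the kind of
    independent, data-parallel work a GPU runs across many cores."""
    return [a * x + y for x, y in zip(xs, ys)]

def parallel_saxpy(a, xs, ys, workers=4):
    # Split the input into one chunk per worker. Each chunk is fully
    # independent, so the chunks can execute concurrently (on a GPU,
    # this would typically be one thread per element).
    n = len(xs)
    step = (n + workers - 1) // workers
    chunks = [(xs[i:i + step], ys[i:i + step]) for i in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map preserves chunk order, so the result is deterministic.
        results = pool.map(lambda c: saxpy_chunk(a, *c), chunks)
    return [v for chunk in results for v in chunk]

print(parallel_saxpy(2.0, [1, 2, 3, 4], [10, 20, 30, 40]))
# → [12.0, 24.0, 36.0, 48.0]
```

A thread pool on a CPU will not show a real speedup for this tiny task; the point is the decomposition itself: no chunk reads another chunk's output, so the hardware is free to run them all at once.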

For an AI system to be intelligent and provide meaningful responses, it needs to be trained on the appropriate data. Unlike most conventional learning methods, which use shallow structured learning architectures, deep learning refers to machine learning techniques that use supervised and/or unsupervised strategies to automatically learn hierarchical representations in deep architectures for classification. Deep learning was constrained by two key factors in its practical applicability. The first was the availability of Big Data; the explosion of Big Data with the growth of the Internet solved the data problem. The second was obtaining, even with Big Data available, the compute power required to harvest valuable knowledge from that data. The large and rapidly growing body of information hidden in unprecedented volumes of non-traditional data requires both the development of advanced technologies and interdisciplinary teams working in close collaboration.

The success of an AI solution depends not only on the accuracy of its responses but also on the time taken to (re)train, develop, and scale accordingly. A reduction in the time required to train a model will fuel the growth of AI solutions released into the market. Several options are available to help reduce the training time of a model. Of all the techniques for solving this problem practically, GPU-based training proved to be the only viable option for deep learning, because the wide availability of GPUs makes parallel processing ever faster, cheaper, and more powerful. Compute parallelization is a vital ingredient in deep learning: computing the weights and biases of network layers, iterating through network hyperparameters, forward propagation, error estimation through gradient descent, and other calculations are all essentially parallelizable operations.
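The parallelizable steps named above – forward propagation, per-sample error estimation, and gradient-descent updates – can be seen in a minimal sketch. The example below (names and hyperparameters are illustrative, not from the original text) fits a single linear unit y = w*x + b by gradient descent; note that each sample's forward pass and error term is computed independently of the others, which is the property GPUs exploit:

```python
def train_linear(data, lr=0.1, epochs=200):
    """Fit y = w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        # Forward propagation + error, per sample: each term depends only
        # on that sample, so all of them could be computed in parallel.
        errs = [(w * x + b) - y for x, y in data]
        # Reduce the per-sample contributions into gradients for w and b.
        gw = sum(e * x for e, (x, _) in zip(errs, data)) / n
        gb = sum(errs) / n
        # Gradient-descent update of the weight and bias.
        w -= lr * gw
        b -= lr * gb
    return w, b

# Training data drawn from y = 2x + 1; w, b converge near 2 and 1.
w, b = train_linear([(0, 1), (1, 3), (2, 5), (3, 7)])
```

On a GPU, the per-sample loop would become one large batched matrix operation, and the same structure scales from this toy neuron to networks with millions of weights.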
The recognition of this fact by early researchers and NVIDIA is perhaps the single most impactful development in the advance of DL over the past half-decade.

Applications of Artificial Intelligence in Science and Engineering

AI in Healthcare:

• The market value of AI in the health care industry is predicted to reach $6.6 billion by 2021. The impact of AI in the health care sector is genuinely life-changing. It is driving innovations in clinical operations, drug development, surgery, and data management.

• Healthcare industries are applying AI to make better and faster diagnoses than humans. AI can help doctors with diagnoses and can warn when patients are worsening, so that medical help can reach the patient before hospitalization.

• One possibility of AI in healthcare is to develop a clinical decision support system for disease prevention that gives the physician a warning when a patient is at risk of developing some disease.

• Another possibility is designing a digitalized device that can detect disease and recommend medicines for common diseases. Some of the revolutionizing applications of AI in health care are explained as follows:

AI in Data Security

• The security of data is crucial for every company, and cyber-attacks are growing very rapidly in the digital world. AI can help to make your data more safe and secure.