
Artificial Intelligence in the Tech Industry

 
Sandip Bhattacharya
Associate Vice President, SAP Practice
April 08, 2019

Professor Max Tegmark, in his book Life 3.0, defines ‘Life 3.0’ as life that is free from its evolutionary shackles. All life forms are limited by their biological hardware (essentially carbon and stardust). Life 3.0 overcomes this limit by adding non-carbon (silicon?) intelligence to human life: intelligence without biological limitations, such as artificial intelligence (AI), in which lifeless objects develop the capability to learn and to react based on that learning. Think of a car, a computer, an interactive movie, or something we are clueless about today.

There are many concepts within the domain of AI, including machine learning (ML), deep learning, neural networks, natural language processing (NLP), virtual reality (VR), augmented reality (AR), long short-term memory (LSTM) networks, and a few others. All of these deal with vast amounts of data in various shapes and forms: sound, images, video, electrical pulses, temperature, pH values, viscosity, biome markers, structured and unstructured data, and so on. Various techniques (like modeling) are used to process such data, or combinations thereof, through methods that can be oversimplified as learning and scoring.

What are the applications or use cases of AI in the technology world, the “tech industry” for heaven’s sake? Somehow, the tech industry has a very squishy definition and can include the semiconductor industry, digital content providers, digital markets, the F1 industry, healthcare, and more. Since I need to get this blog to press, I will limit myself to a few use cases and ideas.

Chip design – Semiconductor chip design is pushing against the molecular limitations of silicon. The design process consists of assembling really large volumes of data and correlating them with simulated scenarios (voltage, heat, power consumption, memory drops, etc.). The traditional process focused on designing the nominal, building variations around it, and creating the best technological product. Over the years, that has expanded to include technology, integration capabilities, collaboration, and the window to ship the product (launch and supply chain). While Big Data platforms help gather the data, it is AI that drives the learning algorithms and the computational optimization over that data.
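To make that concrete, here is a minimal, hypothetical sketch of surrogate-model-driven design exploration in Python: a Gaussian-process model (scikit-learn) is fit to a handful of simulated operating points, and untried candidates are scored so that only the most promising ones go back to the expensive simulator. The voltage, frequency, and power numbers below are invented purely for illustration.

```python
# Hypothetical sketch: surrogate-model-guided design-space exploration.
# The "simulation results" below are made-up numbers for illustration only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

# Each row: [supply voltage (V), clock frequency (GHz)] already simulated.
simulated_points = np.array([
    [0.70, 1.0], [0.80, 1.5], [0.90, 2.0], [1.00, 2.5], [1.10, 3.0],
])
# Corresponding simulated power draw (W) for those points (illustrative).
simulated_power = np.array([0.9, 1.4, 2.2, 3.3, 4.8])

# Fit a Gaussian-process surrogate that stands in for the slow simulator.
surrogate = GaussianProcessRegressor(normalize_y=True).fit(
    simulated_points, simulated_power)

# Score a grid of untried candidates and keep those the surrogate predicts
# will stay under a 3 W power budget, then pick the fastest of them.
volts, freqs = np.meshgrid(np.linspace(0.7, 1.1, 9), np.linspace(1.0, 3.0, 9))
candidates = np.column_stack([volts.ravel(), freqs.ravel()])
pred_power, pred_std = surrogate.predict(candidates, return_std=True)

feasible = candidates[pred_power + pred_std < 3.0]   # conservative cut
best = feasible[np.argmax(feasible[:, 1])]           # fastest feasible point
print("Candidate to simulate next (V, GHz):", best)
```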

Fabs – This business has seen some serious churn; around 83 companies have closed (or consolidated) since 2008. It is a very capital-intensive industry: with every new node generation, lead times get longer, transistors and memory get denser, and layers and dimensions increase. AI is used to increase chip yields (dies per wafer), speed up inspection, and improve quality assurance. Typical tooling and processes result in low throughput, high tooling cost, and, thus, the risk of damaging the product. Companies are now using AI with 3D microscopes that can process 100,000 chips per minute and catch errors (such as material-density variations in wafers) at the nanometer scale, across multiple layers, instantly. Testing and assembly constitute 20-30% of production cost, which no one can afford to ignore. One large memory and dynamic random-access memory (DRAM) company uses deep learning to predict failures before electrical testing starts.
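For a flavor of the kind of deep-learning screen mentioned above (a generic sketch, not the DRAM maker’s actual model), a small convolutional network in Keras could be trained on labeled inspection images to flag dies likely to fail before they reach electrical test. The image sizes and labels below are placeholders.

```python
# Generic sketch of a defect-screening CNN; shapes and data are placeholders.
import numpy as np
import tensorflow as tf

# Placeholder training set: 64x64 grayscale inspection images,
# label 1 = likely defective die, 0 = likely good die.
images = np.random.rand(200, 64, 64, 1).astype("float32")
labels = np.random.randint(0, 2, size=200)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of failure
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(images, labels, epochs=3, batch_size=32, verbose=0)

# Dies with a high predicted failure probability get routed for review
# instead of going straight to (expensive) electrical testing.
print(model.predict(images[:5], verbose=0).ravel())
```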

Maintenance – High capital cost means that all assets must be up and running at peak performance, and you need to know before something breaks down. This is complex: there is a huge amount of data from sensors and various other condition-monitoring systems. AI algorithms fed with all these data elements can separate white noise from real signals and recognize errors, or small changes that may blow up soon. These algorithms can predict slowdowns and breakdowns and, therefore, guide operational decisions.
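Here is a minimal sketch of this idea, assuming vibration and temperature feeds from a condition-monitoring system (the readings and thresholds are made up): an isolation forest is trained on healthy operation and then flags incoming readings that drift away from it.

```python
# Minimal sketch: flagging anomalous sensor readings before a breakdown.
# Sensor names and values here are illustrative, not from a real plant.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Normal operation: vibration (mm/s) and bearing temperature (deg C).
normal = np.column_stack([rng.normal(2.0, 0.3, 1000),
                          rng.normal(60.0, 2.0, 1000)])

# Train only on data assumed to represent healthy operation.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# New readings arriving from the condition-monitoring system.
incoming = np.array([[2.1, 61.0],    # looks healthy
                     [4.5, 78.0]])   # drifting toward failure
flags = detector.predict(incoming)              # +1 = normal, -1 = anomaly
scores = detector.decision_function(incoming)   # lower = more anomalous

for reading, flag, score in zip(incoming, flags, scores):
    status = "OK" if flag == 1 else "INSPECT"
    print(f"vibration={reading[0]:.1f} temp={reading[1]:.1f} -> {status} ({score:.3f})")
```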

Supply Chains – The age-old problem of matching supply with demand still exists. Enterprises are constantly trying to forecast accurately and optimize replenishment strategies. Structural changes (outsourcing, value-driven supply chains, etc.) and process optimization can achieve only so much; some even believe that supply chains are already process-optimized. Leaders have taken an inside-out approach centered on internal capacity (NPI, SKU mix, distribution networks, promotions, etc.), peppered with a few external parameters or patterns. It is not about thinking outside the box; it is about increasing the size of the box. Organizations have to deal with several external factors, such as market perceptions, the competitive landscape, geopolitical changes, regulatory implications, weather, and the sheer volume of IoT data coming from sensors. COOs are experimenting with AI solutions for predictive forecasting and replenishment, using data that was previously unavailable or impossible to handle given its sheer volume, diversity, and range of sources.
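An illustrative sketch of what such predictive forecasting can look like, assuming weekly demand history plus one external signal (a made-up weather index standing in for the external data feeds): lagged demand and the external feed become features for a gradient-boosting model that forecasts the coming weeks.

```python
# Illustrative sketch: forecasting demand from internal history plus one
# external signal (a made-up weather index), using simple lag features.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(7)
weeks = 120
demand = 100 + 10 * np.sin(np.arange(weeks) * 2 * np.pi / 52) + rng.normal(0, 3, weeks)
weather_index = rng.normal(0, 1, weeks)  # stand-in for an external data feed

# Features: demand in the previous three weeks plus the external signal.
X, y = [], []
for t in range(3, weeks):
    X.append([demand[t - 1], demand[t - 2], demand[t - 3], weather_index[t]])
    y.append(demand[t])
X, y = np.array(X), np.array(y)

model = GradientBoostingRegressor().fit(X[:-10], y[:-10])   # hold out 10 weeks
print("Forecast:", model.predict(X[-10:]).round(1))
print("Actual:  ", y[-10:].round(1))
```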

Robotics – Most of today’s robots are rule-based and designed to perform repetitive tasks efficiently. They cannot react to change and are built for a single purpose; should the process, environment, or product change, you need to retrofit or replace them. Deep learning enables object recognition and semantic segmentation: the ability to recognize object properties and the context around the robot. Such robots can collaborate with humans, learn progressively, teach other robots (scary, eh!), and are flexible enough to adapt. A flash memory company has deployed free-roaming robots that talk to each other and jump into action to increase throughput. These machines do not collide with other objects or humans, do not congest the shop floor, and can even change their own batteries.
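For a flavor of the perception step such robots rely on, here is a sketch using an off-the-shelf pretrained segmentation model (torchvision’s DeepLabV3 is an illustrative choice, not what any particular fab robot runs); the camera-frame filename is hypothetical.

```python
# Sketch of the perception step: per-pixel semantic segmentation of a camera
# frame with an off-the-shelf pretrained model (illustrative choice).
import torch
from torchvision import models, transforms
from PIL import Image

# weights="DEFAULT" requires a reasonably recent torchvision release.
model = models.segmentation.deeplabv3_resnet50(weights="DEFAULT").eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

frame = Image.open("shop_floor_frame.jpg").convert("RGB")  # hypothetical camera frame
batch = preprocess(frame).unsqueeze(0)                     # shape: (1, 3, H, W)

with torch.no_grad():
    output = model(batch)["out"]          # (1, num_classes, H, W) class scores

# Class index per pixel: the robot can tell "person" pixels from "background"
# and plan its path around them.
labels = output.argmax(dim=1).squeeze(0)
print("Detected class indices:", labels.unique().tolist())
```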

We need a ton of data to infer, and that is a bit inefficient. Let’s twist the question: do we need so much data for every prediction or prescription at all? The answer may lie in edge computing and LSTM (long short-term memory) networks. Edge computing puts low-level compute power at the end devices, rather than sending all the data to a centralized hub. Instead of sending the terabytes of data generated by an F1 car to a central server, each device (like a tire-pressure sensor) has a small compute capacity and sends only relevant data to the next sensor or to the central server. LSTM, on the other hand, handles large chunks of data by breaking them into smaller, relevant pieces, with the ability to forget and to pass on. Peanuts and butter versus peanut butter: an LSTM can deduce a context-sensitive outcome using its forget gate. So, if you have bread on your plate and a glass of milk, an LSTM will infer that you are going to spread peanut butter on your bread. Watch a simple demo here:

https://youtu.be/mLxsbWAYIpw.
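For a toy version of the peanut-butter idea, the sketch below (made-up kitchen “events”, Keras LSTM) trains a tiny model to predict the next item from a short context; the forget gate inside the LSTM layer is what lets it keep or drop earlier context.

```python
# Toy sketch of context-sensitive prediction with an LSTM (made-up data).
import numpy as np
import tensorflow as tf

# Tiny vocabulary of kitchen "events"; the sequences are invented examples.
vocab = ["bread", "milk", "cereal_bowl", "peanut_butter", "spoon"]
idx = {w: i for i, w in enumerate(vocab)}

# Training pairs: a short context sequence -> the most likely next item.
contexts = [["bread", "milk"], ["milk", "cereal_bowl"],
            ["bread", "milk"], ["cereal_bowl", "milk"]]
next_item = ["peanut_butter", "spoon", "peanut_butter", "spoon"]

X = np.array([[idx[w] for w in seq] for seq in contexts])
y = np.array([idx[w] for w in next_item])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=len(vocab), output_dim=8),
    tf.keras.layers.LSTM(16),              # the forget gate lives inside this layer
    tf.keras.layers.Dense(len(vocab), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=300, verbose=0)

# Context "bread, milk" should point toward peanut butter.
probs = model.predict(np.array([[idx["bread"], idx["milk"]]]), verbose=0)[0]
print("Predicted next item:", vocab[int(np.argmax(probs))])
```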

These techniques are used in autonomous cars, law enforcement, fraud detection (recall Zuckerberg’s Senate hearing), marketing campaigns, precision medicine, biome research, music production, personal assistants, etc.

AI enables facial recognition on your phone. Conversational AI is reducing the tactile footprint of human interaction, creating naturalness across an ever-increasing user surface area. Advancements in AI are transferring cognitive burden from the user to the device. We experience AI every day in some shape or form.

Where do you start? In today’s world, if your assets are not digital, your business is on life support. Until your assets are digitized, do not waste your time on AI. The first step is to go digital.

One can argue that managing huge volumes of diverse data is a big issue in the tech industry, which has matured in terms of the proliferation of data nodes (sensors) and the collection of real-time data. Establishing a robust Big Data architecture, and continuously improving its capabilities, is the next step before adopting AI. 5G is coming (brace yourself) with 5x lower latency, 100x greater speed, 1,000x more endpoints for IoT, true machine-to-machine networks, and an explosion in machine-to-user and user-to-machine interactions. This will further the agenda of Big Data architecture and edge computing capabilities.

Organizations must drive a culture of thinking big and of inviting fresh talent. AI is not data warehousing or analytics; it is a very different and interesting breed of computational science. General-purpose solutions from big-box software companies are no match for pointed solutions from open-source garage geeks. Most of these initiatives will fail; however, each of them will make us smarter. There are no failures, only learning experiences. Leaders will have to deal with the challenge of directing scientific intellect toward business outcomes.

The tech industry and AI are symbiotic by definition. The tech industry has taken up the challenge of jumping every physical boundary to move to the next orbit, mostly through transformational change. AI is evidently a force multiplier in that regard.

Machines have been built for ages (starting with the wheel) to help humans pursue higher goals. There are ethical questions, and negligible consensus, around privacy, individual freedom, dominance, exploitation, and more. Professor Tegmark argues that matter will become more and more intelligent; it is up to us (carbon intelligence) to deal with that. Life arose in a soup of chemicals (primarily amino acids) under some extraordinary physical conditions; if we do not address the ethics and higher purpose of AI, humanity may end up in a very undesirable soup. He offers evidence that humanity, in general, has been constructive: we messed up with fire, but eventually built fire alarms, fire extinguishers, fire exits, and fire engines to take care of it. I believe AI will make us more intelligent.

I am writing this while Northern California is dealing with forest fires that have cost lives, destroyed property, and displaced communities. I wish that someday AI and robotics can be part of the solution that prevents such calamities; we have to work hard.


Tags:
Artificial Intelligence
Hi-Tech
Big Data and Analytics
Data modelling
Machine learning
Semiconductors