How AI can take over our free will
Ramya Paramasivam Gupta - Area Sales Director, HCL Europe | February 5, 2019

We humans are governed by our free will to choose and make decisions. But in the world of technology and data brokers, those who can predict and steer our decisions are, in effect, influencing that free will. Operating behind screens, they constantly collect private information to build personas and enable targeted marketing. Following the Cambridge Analytica scandal, the market is under increased vigilance from regulators. As data harvesting grows more widespread, companies are turning to artificial intelligence (AI) and machine learning models that run in the background to collect data and understand individual behavior.

Many credit rating agencies and technology-based enterprises are leading data harvesters, and several are now under investigation. With the explosion of online services, platforms such as Google, Facebook, Twitter, and Snapchat are also under scrutiny, as each forms a critical node that gathers disparate data points into a single view. The rise of connected devices has made online behavior mirror real-world behavior, merging the two into one and the same experience. According to an IDC analyst report, data vendor sales could grow from $3.1 bn in 2017 to $101.1 bn by 2022. The industry's biggest ambition is to connect the offline and online worlds and fuse them into a single unit.

Companies like Experian and Callcredit utilize demographic, sociographic, religious, spiritual, lifestyle, cultural, mortgage, property, and insurance data to categorize individuals. Large families are analyzed because they help brokers understand behavior and connect different data sources, enabling linked algorithms. Though the data is collected in anonymized form, it contains pointers that can help re-identify individuals. It sometimes includes highly sensitive information, such as biometric, eye-test, dental, and medical records, which can be used to predict health issues or cause real damage if it lands in the wrong hands. Additionally, new advancements in healthcare technology are opening up access to fitness-tracking data. Machine learning and artificial intelligence applications are making it easier for firms to assemble and tag this ever-increasing volume of data.
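
To illustrate why "anonymized" data can still point back to individuals, here is a minimal, purely hypothetical sketch: two de-identified datasets are joined on shared quasi-identifiers (postcode prefix, birth year, gender), which is enough to attach names to sensitive records. All column names and values below are invented for illustration.

```python
# Hypothetical illustration: two "anonymous" datasets re-identified by
# joining on shared quasi-identifiers. All names and values are invented.
import pandas as pd

# De-identified health records (no names, but quasi-identifiers remain)
health = pd.DataFrame({
    "postcode":   ["EC1A", "SW1A", "EC1A"],
    "birth_year": [1985, 1972, 1985],
    "gender":     ["F", "M", "M"],
    "condition":  ["asthma", "diabetes", "hypertension"],
})

# A marketing list that carries names alongside the same quasi-identifiers
marketing = pd.DataFrame({
    "name":       ["A. Smith", "B. Jones", "C. Patel"],
    "postcode":   ["EC1A", "SW1A", "EC1A"],
    "birth_year": [1985, 1972, 1985],
    "gender":     ["F", "M", "M"],
})

# Joining on the quasi-identifiers links conditions back to named individuals
reidentified = marketing.merge(health, on=["postcode", "birth_year", "gender"])
print(reidentified[["name", "condition"]])
```

Even a handful of coarse attributes is often enough to make a record unique, which is why "anonymous" in the data-broker world rarely means unidentifiable.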

Cybersecurity

Cybersecurity regulations are now strict, and firms offering cybersecurity services are plentiful. In an increasingly sophisticated threat environment, cybersecurity strategy should be driven by an organizational culture that promotes security. There is a noticeable trend of developers leaving leading social media firms for companies dedicated to securing user data and building human value rather than consumer value. Social value-driven organizations should encourage more developers to move into cybersecurity and ethical hacking to contribute to the wellness of society. As technologists and consumers, we don't have to rally against AI and machine learning algorithms; striking the right balance can save the day without forgoing innovative technology.

Social view

Humans naturally oscillate between good and bad, between intelligent and idiotic; tagging that behavior with a score only pressures people to stop oscillating. It is perfectly acceptable to be competent, vulnerable, introverted, and extroverted on the same day, depending on the situation and the people involved. Stereotyping individuals into a fixed persona only helps marketers and ignores the human element. Can this be treated as a human rights violation? If so, should human rights organizations step in? Instead, we must turn to innovative technology to help bring more freedom, meaning, and happiness into people's lives.

Mine data responsibly

We need data that can be leveraged to predict behavior and help individuals make smarter decisions. But the data and the connection algorithms should also help create balanced human beings. Artificial intelligence and machine learning applications should provide correcting mechanisms for better outcomes in life rather than pigeonholing individuals and driving them from bad to worse. For example, a bad credit score should not stop a person from enjoying life. Will data be used to exploit the vulnerabilities of human nature, or to protect against them?

The following are scenarios in which data can be mined responsibly:

  1. Data predicting mental health issues
  2. Data predicting medical concerns
  3. Data predicting road accidents and warning the driver with the right set of messages (see the sketch below)
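
For the third scenario, here is a minimal, purely hypothetical sketch of what responsible use could look like: a few telemetry signals feed a simple rule-based check that returns a supportive warning message rather than a punitive score. The signal names and thresholds are invented for illustration.

```python
# Hypothetical sketch: choosing a supportive warning message from driving
# telemetry. All signal names and thresholds are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Telemetry:
    speed_kmh: float        # current speed
    speed_limit_kmh: float  # posted limit for this road segment
    raining: bool           # from a weather feed
    hours_driving: float    # time since the last break

def warning_message(t: Telemetry) -> Optional[str]:
    """Return a helpful warning, or None if no elevated risk is detected."""
    if t.hours_driving >= 2.5:
        return "You have been driving for a while; consider taking a short break."
    if t.raining and t.speed_kmh > 0.9 * t.speed_limit_kmh:
        return "Roads are wet; easing off the speed would be safer."
    if t.speed_kmh > t.speed_limit_kmh:
        return "You are over the posted limit; please slow down."
    return None

# Example: driving near the limit in the rain triggers a gentle nudge.
print(warning_message(Telemetry(speed_kmh=95, speed_limit_kmh=100,
                                raining=True, hours_driving=1.0)))
```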

Start-ups and other soul-driven companies must look for new ideas to secure user data, for the betterment not only of people's physical health but also of their mental and spiritual wellbeing.

