• GAiA can be deployed on Google, AWS, or Azure clouds, or in a private cloud, on-premises data center, or in a bare metal environment.
• Customers can go to the GAiA public marketplace and download models developed by Tech Mahindra and others, and retune and retrain them for their own use.
Leveraging artificial intelligence to obtain better insights, make more informed predictions, and improve operations is a top priority for most mid-sized and large organizations. However, the process can be daunting. Developing models is time-consuming and skilled resources are scarce and expensive, leaving many organizations in search of solutions that will help them streamline the process. AI platforms strive to do just that by providing a comprehensive environment for developing, training, testing, sharing, deploying, and managing AI models.
Tech Mahindra’s AI platform, GAiA, is the enterprise edition of the open source Acumos platform and is based on its latest Boreas release, which was launched at the end of June 2019. GAiA is designed to make it easier for enterprises to build and manage models, and to collaborate with internal and external stakeholders and partners when deploying AI applications. Not only does it provide tools for developing and training models, it also operates a marketplace in which users can share and download AI models to retrain with their own data. Making the platform even more compelling is the fact that it is cloud and infrastructure agnostic; GAiA can be deployed on Google, AWS, or Azure clouds, or in a private cloud, on-premises data center, or in a bare metal environment. Unlike other model management platforms that are tied to a specific cloud, GAiA can be hosted in the environment of the customer’s choosing. This speaks to the growing preference of customers to work with multiple cloud providers or hybrid environments.
Designed to be an end-to-end platform and ready to use out of the box, GAiA provides tools and capabilities that streamline the adoption of artificial intelligence. Customers can go to the GAiA public marketplace and download models developed by Tech Mahindra and others, and retune and retrain them for their own use. They can then upload the models to an internal marketplace and share them with specific users, another department, across a business, or with other organizations through federation. They can also use the model training and test harness capabilities of the platform before downloading and deploying models into a production environment.
Models are downloaded to the GAiA platform as Docker images that can be retrained, retuned, and reused. Since developers aren’t starting from scratch, solution deployment is accelerated. Functions can be shifted to business users, reducing reliance on scarce enterprise developers and data scientists. Additionally, GAiA’s Design Studio allows users to easily drag and drop models to combine machine learning solutions that solve complex business problems. Furthermore, Tech Mahindra differentiates its platform by complementing it with a suite of professional services, which includes model and use case development; model deployment and integration; model customization, retraining, and validation; platform monitoring, support, and feature customization; adapter and API development; reporting; and infrastructure setup.
Tech Mahindra’s GAiA represents a vision of an end-to-end AI model lifecycle management platform; one that is flexible and open, promotes collaboration, and encourages a broader AI ecosystem.
Charlotte Dunlap – Principal Analyst, Application Platforms
• EventBridge advances AWS’ DevOps Agenda
• Cloud rivals are challenged to bundle DevOps and Serverless technologies
Enterprises continue to struggle with application modernization complexities involving new microservices and serverless computing architectures. As a result, public cloud providers are trying to do more of the heavy lifting of infrastructure constructs through new DevOps solutions supporting event-based workloads.
During the recent AWS Summit, CTO Werner Vogels announced the general availability of EventBridge, which integrates operational data from external SaaS applications and helps automate DevOps processes within a serverless model. The concept is attractive to enterprises moving into cloud technologies because Amazon is acknowledging their need for application lifecycle management (ALM) technologies while making that data, typically delivered in a SaaS format, available within AWS management services (e.g., AWS Console/CLI/SDKs).
AWS EventBridge is a serverless event processing model based on CloudWatch Events; it provides integration between AWS apps and the business systems important to operations teams, such as analytics and application performance management (APM). The event bus leverages Lambda serverless functions with the goal of further abstracting infrastructure complexities away from DevOps team members.
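To make the rule-and-pattern model concrete, here is a minimal, pure-Python sketch of how an EventBridge-style rule filters events before routing them to a target such as a Lambda function. This illustrates the concept with simplified matching semantics; it is not the actual AWS API, and the rule pattern and event fields shown are hypothetical.

```python
# Simplified EventBridge-style matching: a rule's event pattern matches an
# event when every field named in the pattern exists in the event and the
# event's value appears in the pattern's list of accepted values.

def pattern_matches(pattern: dict, event: dict) -> bool:
    for key, accepted in pattern.items():
        if key not in event:
            return False
        value = event[key]
        if isinstance(accepted, dict):
            # Nested fields (e.g., "detail") are matched recursively.
            if not isinstance(value, dict) or not pattern_matches(accepted, value):
                return False
        elif value not in accepted:
            return False
    return True

# Hypothetical rule: route high-severity APM alerts from a SaaS partner
# to a serverless target.
rule_pattern = {
    "source": ["partner.apm"],
    "detail-type": ["Latency Alert"],
    "detail": {"severity": ["high", "critical"]},
}

event = {
    "source": "partner.apm",
    "detail-type": "Latency Alert",
    "detail": {"severity": "high", "service": "checkout"},
}

print(pattern_matches(rule_pattern, event))  # True: the rule would fire
```

Note that extra fields in the event (such as "service" above) are ignored, mirroring the general idea that a rule constrains only the fields it names.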
Cloud rivals are beginning to offer similar DevOps solutions, but need to do a better job highlighting and bundling their own event-based functions offerings. EventBridge rivals Microsoft Azure DevOps, formerly called Visual Studio Team Services (VSTS). Microsoft positions the solution’s CICD and pipeline capabilities as complementary to Azure Functions for helping developers create microservices-based applications without having to manage the infrastructure.
The new AWS offering also rivals Google Cloud Platform (GCP) whose DevOps and serverless solutions include Google Cloud Functions and Stackdriver, a monitoring system which provides IT teams with performance data about apps and VMs running on GCP and AWS.
AWS has a reputation for providing innovative cloud services, but lacks in development tools and frameworks that offer guidance and best practices for companies moving into new business transformation territory. Operations teams in general have struggled with moving modern apps into production and tying into those apps proper security, governance, and policy management. During his two-hour keynote speech, Vogels tried to reassure enterprise developers of AWS’ commitment in easing complexity around cloud native app development and deployment. In addition to EventBridge, he provided a recap and some updates of AWS’ key developer tools, including:
– AWS Cloud Developer Kit (CDK), an app development framework for developers of infrastructure as code, provisioned through CloudFormation.
– AWS App Mesh, to improve visibility and network traffic controls associated with services built across multiple types of compute infrastructure.
– Amazon Managed Blockchain, to simplify the creation of blockchain networks under the industry standards Hyperledger Fabric and eventually Ethereum.
– Amazon SageMaker, an AI platform that serves developers as well as data scientists.
Interestingly, Vogels made no mention of the fall 2018 announcement of Firecracker micro-VM technology, which targets DevOps teams tasked with managing, governing, and securing the implementation of serverless computing scenarios. The technology was well received in the developer community for its ability to leverage KVM and boost the performance of containers and serverless infrastructures, including AWS Fargate and AWS Lambda. AWS donated Firecracker to the open source community, which triggered a positive response from cloud providers for its ability to further DevOps technologies.
Kathryn Weldon – Research Director, Business Network and IT Services – Americas
• AT&T positions its public safety network, FirstNet, not only as a highly significant win and lucrative opportunity, but also as the highest-performing, fastest, most secure wireless communications network for first responders.
• While Verizon may do less marketing, it remains a very strong player in the public sector, with its own benefits for first responders and a somewhat different approach to the market than its rival.
When AT&T won the FirstNet deal in 2017, it was seen as a major coup for the carrier and a big blow to mobile operator rivals. FirstNet is an independent authority within the U.S. Department of Commerce, authorized by Congress in 2012, with the mission to develop, build, and operate a nationwide broadband network that equips first responders to save lives and protect U.S. communities. In 2017, after an open RFP process, a public-private partnership was forged between the federal government and AT&T. FirstNet agreed to provide 20 MHz of telecommunications spectrum and success-based payments of $6.5 billion over the next five years to support the network buildout; AT&T will spend about $40 billion over the life of the contract to build, deploy, operate, and maintain the network, with a focus on ensuring robust coverage for public safety. AT&T can also use FirstNet’s spectrum for other, commercial purposes when it is not being used by public safety, but it must prioritize first responders over any commercial users. As of May 2019, AT&T had connected approximately 600,000 wireless devices to the network from 7,250 agencies, and offers FirstNet on Band 14 spectrum in 600 markets, roughly 50% of its eventual proposed coverage. The operator notes that 50% of these agencies are new to AT&T rather than upgrades from existing customers. AT&T doesn’t just provide wireless connectivity to first responders (for phones, tablets, fleets, and IoT devices); it also offers applications, specialized devices, enhanced security solutions, and satellite options. Flying Cells on Wings (COWs), recently introduced, comprise two tethered drones and a trailer equipped with a satellite dish and fiber connections, well suited to providing connectivity in hard-to-reach locations during emergencies such as wildfires and earthquakes.
Although all 50 states agreed to the original FirstNet proposal, this does not mean that AT&T is guaranteed to be the wireless carrier for all public safety agencies. These agencies, as well as the local, state, and federal organizations to which they may report, are free to select whatever carrier they want. They can also mix and match, with some lines or functions provided by an alternative carrier for backup, diversity, or coverage reasons.
Verizon has always been strong in the public sector and in the public safety segment. It too has a dedicated core, with which it can provide prioritization and pre-emption for crucial communications. Verizon notes that it holds a more than 450,000 square mile network coverage advantage over AT&T and that it has partnered with first responders for decades. Its Responder Private Core is part of its 4G LTE network design, is free to qualified agencies, and intelligently manages traffic between commercial and public safety customers. Other benefits of Verizon in this space include its reputation for network reliability (key for voice and 911 calls as well as data) and its ability to anticipate needs, with more “feet on the street” than competitors. While AT&T may do more marketing, Verizon uses credibility, network reliability, and local personnel as key value propositions and considers itself a dominant public sector carrier. It also believes in interoperability between its network and Land Mobile Radio (LMR) or other wireless networks and supports multi-carrier push to talk (PTT). Verizon views FirstNet as a good thing because it makes Verizon “run faster” in public safety and forces it to innovate. For example, it has a relationship with AXON for LTE-enabled body cameras (as does AT&T), and its 5G labs in DC are designing new public safety capabilities such as drone mapping, VR goggles for firefighting, and remote telecare for hospitals. Verizon’s perspective is that AT&T faces financial penalties if it does not get enough subscribers on the network, so AT&T may end up giving priority to more people and devices than necessary. Verizon offers different levels of priority for first responders, but it does not give priority to everyone because it has no thresholds to meet.
Verizon also has advanced solutions that it offers to both public sector and commercial customers such as mobile device management, advanced mapping, event notification, a tactical messaging gateway, COWS, ruggedized multi-band devices, PTT, and fleet management.
Not only is there competition for public safety deals between AT&T and Verizon, but Sprint and T-Mobile also offer some solutions, while satellite providers, MVNOs, ITSPs, and specialists such as Motorola Solutions are strong players whose offerings may coexist with, complement, or directly compete against those of the operators. The bottom line is that the market remains competitive. While FirstNet certainly provides AT&T with a unique position in the market, Verizon and other providers still have a strong stake.
• Synthetic DNA is seen as a solution to the challenge of how to store rising volumes of digital data generated by smartphones, tablets, and Internet-connected sensors.
• Innovations by U.S.-based startup Catalog promise to speed up and reduce the cost of encoding digital data for DNA storage, potentially benefitting commercial adoption.
U.S.-based startup Catalog recently revealed that it had successfully stored all 16 gigabytes of Wikipedia’s English-language text on tiny DNA strands within a laboratory vial, in the latest demonstration of the power and potential of synthetic DNA as a medium for storing digital data. The accomplishment marks a new record for the amount of digital information stored on DNA. Catalog used prefabricated synthetic DNA strands to store the Wikipedia data, along with a DNA writing machine, which currently writes data at a rate of 4 megabits per second, but which Catalog wants to make at least a thousand times faster.
Catalog is one of a growing number of technology companies (along with Microsoft, Intel, IBM, and Samsung) that see synthetic DNA as a potential solution to the challenge of how to store rising volumes of digital data generated by smartphones, tablets, and Internet-connected sensors. According to Cisco, the world will generate some 4.8 zettabytes of digital data by 2022, up from 1.5 zettabytes in 2017. The growing volume of data will challenge existing storage technologies such as magnetic tape, disk drives, and flash memory to keep pace with the rapidly expanding storage requirement. The attractions and benefits of DNA as a medium for digital data storage include its longevity; DNA lasts 1,000 times longer than silicon. In addition, DNA offers higher levels of storage density, with a single cubic millimeter of DNA able to hold a quintillion bytes of data.
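As a quick sanity check on the figures quoted above, growth from 1.5 to 4.8 zettabytes over the five years from 2017 to 2022 implies a compound annual growth rate of roughly 26%:

```python
# Compound annual growth rate implied by the Cisco figures cited above:
# 1.5 ZB in 2017 growing to 4.8 ZB in 2022.

def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

growth = cagr(1.5, 4.8, 2022 - 2017)
print(f"{growth:.1%}")  # roughly 26.2% per year
```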
DNA data storage works by taking digital content that is typically stored using a binary code of zeros and ones and converting it into the genetic code of As, Cs, Gs, and Ts that make up DNA’s chemical building blocks. The converted DNA code is then used to create synthetic strands of DNA, which can be put into cold storage. When needed, the DNA strands can be removed from cold storage and their information decoded using a DNA sequencing machine. The DNA sequence is then translated back into binary format.
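The conversion step described above can be sketched in a few lines. The 2-bits-per-base mapping below is an illustrative assumption (real DNA storage codecs add error correction and avoid problematic sequences such as long runs of a single base), but it shows the round trip from binary to bases and back:

```python
# Sketch of binary-to-DNA conversion: every 2 bits of digital data map to
# one of DNA's four bases. The specific mapping is an illustrative choice.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Convert bytes to a DNA base sequence, 2 bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Convert a DNA base sequence back to the original bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)                   # CAGACGGC ('H' = 01001000, 'i' = 01101001)
print(decode(strand) == b"Hi")  # True: the round trip recovers the bytes
```

In a real system, the encoded sequence would then be synthesized as physical DNA strands and later read back with a sequencing machine, as the paragraph above describes.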
However, existing DNA data storage techniques face challenges that include the prohibitively high cost of DNA sequencing technology and the slow speed at which digital data is converted to DNA and at which stored DNA is sequenced and decoded back into digital format. Catalog is addressing these challenges with a method that it claims is faster and cheaper than existing synthesis approaches. First, Catalog separates the process of synthesizing DNA molecules from that of encoding the digital data. Second, Catalog relies on a relatively small pool of pre-synthesized DNA molecules – fewer than 200 – that can be combined in an exponential number of ways. The approach requires less DNA synthesis, speeding up and reducing the overall cost of encoding data for storage.
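The arithmetic behind this combinatorial approach is worth making explicit: combining a small pool of premade components into ordered sequences yields exponentially many distinct codewords, which is why fewer than 200 molecules can go a long way. The numbers below are illustrative only, not Catalog's actual scheme:

```python
# Illustrative combinatorics: n distinct premade components arranged in
# ordered sequences of length L yield n**L distinct codewords, so capacity
# grows exponentially with sequence length.

def codeword_count(n_components: int, length: int) -> int:
    return n_components ** length

# Even with a pool of only 100 components, short sequences dwarf the pool:
for length in (2, 4, 8):
    print(length, codeword_count(100, length))
# 2 10000
# 4 100000000
# 8 10000000000000000
```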
Last year, Catalog announced that it had raised US$9 million from investors to help commercialize its DNA sequencing and storage technology. And although it has said little about who it expects will use the technology, Catalog is currently in discussions with government agencies, major international science projects, oil and gas firms, and businesses from media and entertainment, finance, and other industries, with a view to lining up pilot agreements.
• Alibaba Cloud’s partnership with Sena on a Smart City initiative strengthens the provider’s partner ecosystem, especially with domestic players.
• However, Alibaba Cloud’s City Brain still has limited customer references compared with other hyperscalers such as AWS and Azure. Cloud stacks such as AWS Outposts and Azure Stack would enable cloud providers to address data residency and edge compute requirements without having local facilities.
On May 23, 2019, Alibaba Cloud announced a collaboration with Sena Traffic System (Sena), Malaysia’s leading smart traffic controller, to build a smart traffic management system in the country. In the partnership, Alibaba Cloud will provide its City Brain solution, cloud computing resources, and talent development programs, while Sena will be responsible for the overall design and development of the traffic light systems as well as the deployment.
Alibaba Cloud City Brain
City Brain leverages data analytics and artificial intelligence to continuously self-learn and adapt to a changing environment, such as events, traffic, and public transportation. It has been successfully implemented in Hangzhou (Alibaba’s hometown and headquarters), increasing average travel speed by 15% and reducing travel time by three minutes. In another Chinese city, Suzhou, passenger volume on pilot public bus routes has increased by 17%.
Alibaba Cloud’s partnership with Sena shows that its investment in building domestic data centers is beginning to pay off. A smart city deployment often requires data to be hosted locally because of the strict privacy and security requirements that local authorities place on their information. Edge compute is also a crucial part of the deployment, ensuring low-latency communications between the platform and devices such as traffic lights. Alibaba Cloud is leveraging its local facility to gain an advantage over other hyperscale providers and capture the growing demand. While some countries such as South Korea, Singapore, and India started their smart city initiatives years ago, many others, especially emerging markets like Malaysia and Indonesia, are still at a very early stage of deployment (e.g., trial or pilot).
With successful references such as City Brain in Kuala Lumpur and local facilities, Alibaba Cloud could press the same advantage in other Southeast Asian markets such as Indonesia. In Indonesia, other hyperscale players (e.g., AWS, Google) are still building their data centers and strengthening their local presence. With existing facilities there, Alibaba Cloud could use its first-mover advantage to address the data residency and edge compute requirements of smart city use cases and hence capture the growing opportunity in the country. With a population of 264 million, Indonesia has several metropolises with populations exceeding 5 million that suffer serious traffic congestion. Alibaba Cloud could highlight its smart city reference in Hangzhou, which has a population of around 7 million, to address similarly sized cities in Indonesia such as Surabaya, Bandung, and Semarang.
However, the smart city market landscape is getting more competitive. Among hyperscale providers, Alibaba Cloud’s customer references still lag well behind those of AWS and Microsoft Azure. There are also other providers, such as system integrators and carriers, building up their smart city portfolios, with stronger propositions in professional services and connectivity respectively. Apart from tougher competition, there is a growing trend toward hybrid cloud, with hyperscale providers offering cloud stacks for private environments, such as Azure Stack and AWS Outposts, as well as Alibaba Cloud’s own Apsara Stack. This would enable the solution to be deployed on-premises or in a private cloud environment without the need for a public cloud hosted within the country.
Gary Barton – Analyst, Business Network and IT Services
• Whilst AI can replace humans, it often works best when used to enhance what humans are doing.
• AI can deliver significant business benefits, but if implemented unsympathetically it can also cause disruption.
GlobalData’s research indicates that businesses understand that AI offers significant potential benefits in areas such as efficiency, R&D, and staff training, recruitment, and retention. The same research finds that enterprises also see potential pitfalls. Whilst the 5% of respondents in GlobalData’s survey who stated that AI is the ‘beginning of the end of the world’ may have had their tongues in their cheeks, a level of concern is not uncommon. Indeed, KPMG has referred to the concept of ‘Robocalypse Now’. It is also not unreasonable for employees to worry that AI-driven automation technologies will mean job losses, because the adoption of those solutions usually does lead to headcount reductions.
Economic realities mean that it will not be possible for most enterprises to ignore AI automation technology. But there are ways that companies can mitigate the impact. A good example is provided by Vodafone UK and its own call center staff. Vodafone started rolling out AI-powered chatbot technology within its contact center estate as part of its efficiency program. This led to a reduction in the number of human agents needed, so Vodafone also began a re-skilling program that allows agents who wish to do so to retrain as programmers. Vodafone is paying for the training scheme because, like many other companies, the mobile operator has encountered an IT skills shortage.
Not all job losses can be avoided, but adopting a constructive and conscientious approach to deploying AI technologies – with both employees and customers – will bring better results. For customers, this means making sure they know when they are talking to a chatbot rather than a human. It also means monitoring dialogue flows to ensure that customers are getting the answers they need.
With employees, businesses should consider how AI can help their staff to be more efficient. Contact centers again provide a great example of how this can be done. AI-powered assistants can help call center agents find the right answers to a customer’s question. No agent can know every procedure and every detail about every product and/or service. Furthermore, these assistants can make newer employees less reliant on their more experienced colleagues meaning that the most experienced staff are on the front-line helping customers more often. AI assistants can also help convert contact centers into profit centers by suggesting potential sales opportunities and helping agents to find the most appropriate product for a customer.
• Knowing the social responsibility position of the vendors you do business with is important; what they do can reflect on you as their customer.
• Keep an open mind and do your research; a vendor that aligns with your organization’s ethos and goals will help ensure a better relationship.
Corporate Social Responsibility – Keep It Real
Increasingly, customers are considering the social position of the vendors they buy from. Who you buy from reflects on the ethos of your company as well; nobody wants to be seen doing business with a vendor perceived as evil or greedy, which is why many companies will not publicly reveal which vendors they use internally. The social position of your vendor is probably not even in the top ten requirements, but it should factor in somewhere. If you really want to partner with a vendor, your corporate ethos and attitudes should point in at least roughly the same direction.
Companies are doing more corporate social responsibility, which is corporate-speak for things like charitable donations, employee-led volunteering programs, and even mental health programs for employees. In short, companies have come to realize that making billions of dollars a quarter in profit and not giving back is not a good idea, let alone a good look.
The initial reaction to announcements about corporate social responsibility is usually positive, followed by some quick doubt about the corporate motive. It is easy to suspect that these moves are driven purely by a need to improve the company’s image rather than any genuine charitable impulse.
There are a few things you should look for when considering the social position of a potential vendor. Do they practice what they preach? For instance, do they give to green causes and then pollute extensively in other parts of the world? Do they promote workplace equality but regularly do business with oppressive regimes?
Lastly, and probably most importantly, unclench a bit and be open to the idea that it’s not a sham. We are in the middle of an era with unprecedented access to information… and the disinformation that goes with it. This makes us untrusting and cynical, especially about the motivations of those selling goods and services. Keep an open mind and research what these companies do to give back, both to the community and to their employees.
• Late last week AT&T and Samsung together cut the ribbon on a co-developed 5G Innovation Zone that had nothing at all to do with consumer 5G future opportunities.
• Rather, the new facility, housed within Samsung Austin Semiconductor’s Austin Texas fabrication plant, showcased several ways high speed cellular can both modernize and optimize manufacturing processes.
If you travel a few miles northeast of Austin, Texas, you’ll find among the gentle rolling hills an undistinguished 300-acre facility dedicated to the fabrication of semiconductors (aka computer chips) for networking, high-performance computing, IoT, and of course mobile devices. And if you look carefully within the foyer of this 20-plus-year-old foundry, you’ll find a somewhat unassuming rectangular room peppered with Ikea-styled demonstration tables and plain black monitors that, considered together, scream out in all caps: “5G IS VERY REAL, RIGHT NOW!”
But not as you’d think.
Announced last October as a co-developed project between AT&T, Samsung Electronics America, and Samsung Austin Semiconductor, this new rectangular room, officially referred to as an Innovation Zone, has very little to do with mobile phones or mobility in general. Instead, it serves as a real-world testbed for 5G technology in support of plain old manufacturing processes. The two companies hope the Innovation Zone will help their customers and partners see the immediate value of 5G as a core networking technology upon which they can modernize, optimize, and, if all goes well, innovate.
The trouble with 5G, of course, is that it is simply not yet available globally as a true replacement for 4G standards like LTE. According to GlobalData research, that won’t happen for another few years (see Figure 1). Early rollouts from telecom operators like AT&T, Verizon, and T-Mobile have been very selective and limited in both purpose and scale. It seems we’ll have to wait some time before we can rely on pure 5G alone to stream Netflix videos on the morning commute.
But that doesn’t mean 5G can’t make a significant impact right now within select enterprise use cases like industrial IoT (IIoT) for manufacturing, where GlobalData predicts a steady and significant increase in the number of connected devices, particularly in support of industrial monitoring and metering (see Figure 2).
And that’s why both AT&T and Samsung see their new joint Innovation Zone as a window into both the present and the future of 5G, where as it turns out 5G is in fact all about streaming videos, not on the morning commute, but rather while chips whiz around a factory floor. To explain, here are a few select 5G use cases on display at Samsung’s facility.
• Video Camera-as-a-Sensor – Manufacturers can stream millimeter-wave thermal analysis of both machines and chips in motion, looking for potentially costly abnormalities. AT&T and Samsung demonstrated how this same technology could also be used to monitor engineers on the floor for safety and compliance: are they overheated, are they wearing the proper safety gear, and are they who they say they are?
• Wireless Robotic Instrumentation and Control – Manufacturers can free robotics from wired constraints/costs using highly portable and affordable sensors. The vendors showed an example here running Google’s Fuchsia OS on a coin-sized sensor — a potentially huge deal for manufacturers struggling to modernize large, aging facilities.
• Mixed Reality Maintenance – Using virtual and augmented reality goggles, the two firms showed how a maintenance worker could navigate a tricky repair process with an augmented overlay displaying process checklists, circuit board diagrams, and real-time video instructions. Aside from the usual value in providing an overlay of real-time, data-driven knowledge, the key here is the potential for knowledge transfer among engineers, something immensely practical when you consider that many line engineers must rotate out every 15 minutes in some environments.
• Real-Time Big Data Insights – AT&T and Samsung showed how 5G can facilitate high volume, low latency measurement and sensor control. This may not sound impressive at first, but when you consider that many manufacturers only instrument every machine and payload (e.g., a computer chip) intermittently, the ability to suddenly afford to instrument “everything” is truly transformational. Predictive maintenance AI models, for example, can run at a much, much higher rate of success with defective payloads and potentially defective machines more consistently and readily detected and identified respectively.
Obviously, the first beneficiary of the new Innovation Zone is going to be Samsung itself. The Austin-based fabrication (and recently certified foundry) facility is very much looking forward to these potential 5G benefits. The last example (instrumentation insights) is of particular interest to the company, as at present it builds its predictive models by instrumenting only a percentage of its machines and payloads.
When you consider that a given fabricated semiconductor can spend up to 60 days traversing the sizable facility, any single unit itself could hold the key to revenue lost or saved. Full instrumentation is the goal, but to date the mostly wired- and WiFi-based instrumentation network at Samsung Austin Semiconductor falls short thanks to WiFi dead zones and the prohibitively high cost of wiring up every single station and chip. With 5G at the ready, Samsung intends to right those wrongs and usher in a new era of not just optimization but also potential innovation. GlobalData will keep close tabs on their progress.
• Consumers are becoming aware that their personal data is being mined and misused. They will demand changes and control.
• Companies, starting with IT departments, need to get in front of this trend and become more customer-conscious about personal data and privacy, giving customers control and choice over how their data is used before laws and regulations leave them no choice at all.
The definition of ‘me’ is expanding. ‘Me’ used to be about personal identity and one’s physical person, perhaps even extending to the immediate family around you. ‘Me’ is getting bigger, though, and extends to a lot more things. ‘Me’ is now also anything about ‘me’ including metadata about me. ‘Me’ is the data I generate from just living, the things I do, the products I buy, the music I like to listen to, and the entertainment I enjoy. ‘Me’ is browsing habits, daily habits, the places I go, the things I stop and look at in stores; my preferences for temperature, color, and foods; even my face, my eyes, my fingerprints, the patterns of veins in my hands.
In the era of AI and big data, this data is being harvested and, in many cases, exploited. Vendors display with glee the precision with which they can track people’s movements via WiFi or cellular and add AI processors to cameras to improve facial recognition. New technologies are being created and revealed every day, all designed to take a better digital profile of ‘me.’ There is a growing recognition among consumers that while it once seemed harmless to trade personal data for access to applications and services, the disparity of power has grown to the point of being threatening to consumers.
As IT managers and industry professionals, what can be done? The vendors in our industry are not helping, ignoring or downplaying ethical considerations and the broader trends of consumer interest: “We can’t control what people do with our tech.” Business management and marketing at the companies we work for all ask IT: “How can we monetize the data we collect?” This puts the company and its employees in a precarious position, both ethically and professionally. What can be done?
Warn, educate, and advise your employers. The EU’s GDPR was a shock to some organizations, so start there. Emphasize that this isn’t just a meddlesome government regulation, and remind them of how painful it was to get into compliance and remain there. GDPR was just the tip of the spear. Customers are going to increasingly demand to know not only how and where companies are keeping their data, but also who they are sharing it with, and they will demand the right to delete it or not have it tracked to begin with. Prepare for looks of incredulity, doubt, and outright mockery. The gold rush to monetize everyone’s data built quite a few of our tech unicorns, but those days are over. People are beginning to delete apps and services once thought to be indispensable and learning that the sky does not fall. Have executives ask around: how many have deleted social media apps that everyone used to use? That’s the sign that people are beginning to understand the value of their data and privacy. Soon, they will demand control.
First, IT managers must familiarize themselves with the issue of privacy and data ownership. It doesn’t matter how you personally feel about the issue; it’s something you need to understand so you can guide your company through these choppy waters. Then, educate your staff. This isn’t an exercise that can be done with a single meeting or handout; it will take time and discussion. Next, with your staff, make a list of areas where the company may be in jeopardy around privacy and data. Take it to management, and again, this isn’t going to be easy or quick. There isn’t a silver bullet or video you can show people to get them to understand. Time and consistent work are needed.
Make your company the leader in consumer-conscious data protection and privacy. You will gain customers and business in the long run. Companies that give that control to customers, and communicate it, can win in the marketplace. This means no more unreadable, 20-page click-wrap agreements written in Old High Legalese, but instead clear, concise, easy-to-read terms outlining customers’ right not to be tracked and to have their own data deleted.
Companies need to do this and need to do it now. Or, it will be done to us via regulations and laws. Companies that aren’t prepared and customer-conscious about personal data and privacy will be the ones that can’t adjust and fail. Be the tireless advocate your company needs.
Newly published research shows language in Facebook posts can be a more accurate tool than demographic data for helping medical professionals make a diagnosis.
The Facebook data is particularly effective in shedding light on certain health issues including diabetes and mental illness.
Facebook has been under fire for years for everything from the Cambridge Analytica scandal to the platform’s part in aiding the dissemination of false information about the Rohingya Muslims that led to the deaths of thousands in Myanmar. Though it is sometimes derided as a tool that does more to isolate than connect, newly published findings by researchers from Penn Medicine and Stony Brook University show Facebook posts can provide important clues to puzzle out a number of medical conditions including diabetes, depression, and psychosis.
The research applied natural language processing and other tools to analyze more than 20 million words posted on Facebook by 999 volunteers to assess whether they were accurate indicators of medical issues in 21 categories. The results showed language in these posts can play a quantifiable role in helping medical professionals accurately diagnose their patients. Researchers noted that Facebook posts provide insights into the lives of patients that are particularly important in assessing patients for mental health conditions including alcoholism and anxiety.
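The researchers’ actual pipeline is not described in enough detail here to reproduce. Purely as an illustration of the general idea (predicting a label from word usage in posts), the following is a minimal bag-of-words naive Bayes sketch in plain Python. All of the training examples and the labels are invented toy data, not material from the study.

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (text, label) pairs. Returns per-label word counts and label totals."""
    counts = {}          # label -> Counter of word frequencies
    totals = Counter()   # label -> number of training documents
    for text, label in docs:
        counts.setdefault(label, Counter()).update(text.lower().split())
        totals[label] += 1
    return counts, totals

def predict(counts, totals, text):
    """Pick the label maximizing log P(label) + sum log P(word|label), Laplace-smoothed."""
    vocab = {w for c in counts.values() for w in c}
    n_docs = sum(totals.values())
    best, best_score = None, float("-inf")
    for label, c in counts.items():
        denom = sum(c.values()) + len(vocab)
        score = math.log(totals[label] / n_docs)
        for w in text.lower().split():
            score += math.log((c[w] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

# Toy, invented examples -- not data or labels from the study.
training = [
    ("tired thirsty all the time sugar cravings", "diabetes-risk"),
    ("checked my blood sugar again feeling thirsty", "diabetes-risk"),
    ("great hike today beautiful weather", "baseline"),
    ("dinner with friends lovely evening", "baseline"),
]
counts, totals = train(training)
print(predict(counts, totals, "so thirsty and tired lately"))  # → diabetes-risk
```

A real system would, of course, need far richer features, clinical validation, and explicit patient consent; the sketch only shows why word choice can carry a statistically usable signal.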
In some cases, researchers said the language in Facebook posts was “significantly more accurate” in medical diagnostics than demographic data alone. Moreover, when used in conjunction with demographic information, the combined data was a more accurate diagnostic tool than the Facebook posts by themselves.
There is a precedent for leveraging social media and adjacent technologies for medical purposes. Twitter feeds have been used to identify disease outbreaks, and new research indicates that smart speakers such as Amazon’s Alexa may be able to identify whether a user is experiencing cardiac arrest based on breathing patterns.
However, there are just as many questions and controversies about the application of this data in healthcare. Concerns around privacy and informed consent may quell enthusiasm for widespread diagnostic use of social media data. In addition, researchers recognize limitations in the types of conditions the data was able to help identify, further limiting the usefulness of the technology in diagnosing medical conditions.