Power of Data

Democratisation of Object Data Within the Telecoms Sector

Can telcos alone deliver the ecosystem needed to deploy and monetise 5G and IoT? Can we imagine telcos delivering anything other than certain types of connectivity?

Tirath Virdee

Advisor - Data Democratisation

& Artificial Intelligence

2022-03-29
Technical Level: Intermediate

Tirath Virdee, advisor for data democratisation and artificial intelligence at EyA, covers the subject of the democratisation of object data within the telecoms sector.


In a short time, humanity has come a long way. The printed word is just 650 years old, yet it has enabled us to move from isolated clans to exchanging ideas and information with outsiders easily and constructively. The possibilities offered by all things being connected are equally revolutionary, enabling a cognisant, conscious, and proactive contextualised environment. The big issue is who owns the data about taxonomies and ontologies, and their interaction to create business value.

For me, the value of anything is in the data that the thing consumes and produces. Once one removes the jargon of technology stacks, one can see the true vectors that enable us to reimagine our businesses. In a span of 30 years, we have developed five generations of cellular connectivity, each offering significantly improved communication possibilities over the previous one. We have gone from voice calls enabled by 1G to the ability to watch HD video with 4G. We are now entering the era of 5G. You will hear time and again how 5G is a game changer for those things that need capacity relief, greater throughput, network slicing, and greatly improved latency. The improved latency supports a greater density of devices, which enables real-time, function-critical applications: connected vehicles, production lines, logistics, smart ports, digital twins, smart cities, connected infrastructure, and so on. Predictions of deployments vary, but some have cited that urban areas could see as many as one million 5G-connected devices per square mile by 2025.

Reimagining telecoms is about reimagining the possibilities of data. Everything is data and data is everything. The difference here is that the data can belong to things rather than to people. The Big Tech companies have democratised data to a large extent: Alphabet, for example, has revenue of around $150bn, roughly 80% of which comes from targeted advertising built on data collected about the people who use its search facilities. The democratisation of object data, environmental cognisance, and their monetisation has not happened. People who believe that the big issue with 5G and IoT is standardisation and metatagging are, I believe, mistaken. Many of the overarching data governance issues that hold us back with personal data (safety, security, provenance, ethics) will not hold us back when we talk about the data, or information, of things. Increasingly, devices will be both generators and consumers of data, especially in the era of Stem-IoT devices and an evolving smart infrastructure enabled by 5G. The need to democratise this data will take on a dimension that it should always have had for the telcos. Telcos have been singularly incapable of monetising their data and the derived intelligence for anything other than customer management.

Mobile Private Networks will have their data. All objects will have their data. Telcos need to reach agreements with their clients and users that enable this data to be aggregated so that it can be monetised, by determining optimisations and looking for exceptions. Aggregation enables the ontologies and taxonomies of real-world objects to be determined and virtualisations to be created, so that environmentally tuned intelligence can be executed. The aggregation of IoT and 5G data will enable telcos to offer monetised, democratised intelligence: through classification, clustering, manifold mappings, transfer learning, and the remarkable possibilities of proactive hypothesis generation and contextualised execution. The virtualisation applies not only to digital twins but also to the fact that we will be increasingly virtualised through our anonymised personas in cyberspace, with everything that implies for the social, political, and monetary models that govern the functioning of that space. That space, I would argue, will be bigger than the physical space. Big data will grow exponentially, with Edge-AI offering some relief in specific spaces.
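
As a minimal sketch of the "looking for exceptions" side of this, the following clusters synthetic device telemetry with scikit-learn's KMeans and flags the minority cluster for review; the features, data, and cluster count are illustrative assumptions, not a production pipeline.

```python
# Sketch: clustering aggregated IoT telemetry to surface exceptions.
# All data is synthetic; the (throughput, latency) features are assumed.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=42)

# Each row is one device's aggregated telemetry: (throughput Mbps, latency ms).
typical = rng.normal(loc=[100.0, 10.0], scale=[15.0, 2.0], size=(200, 2))
unusual = rng.normal(loc=[20.0, 60.0], scale=[5.0, 5.0], size=(10, 2))
telemetry = np.vstack([typical, unusual])

# Cluster devices into behavioural groups.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(telemetry)

# The small minority cluster is the exception group worth human review.
sizes = np.bincount(kmeans.labels_)
minority = int(sizes.argmin())
exceptions = np.where(kmeans.labels_ == minority)[0]
print(f"{len(exceptions)} devices flagged as exceptions")
```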

I have not come across anyone who is not struggling with their data and the intelligence that can be derived from it. This leads us to consider how best to use data as a force for good, and how to turn data exchange into a positive experience for citizens. The struggle is not only about culture and structure but also about ethics and governance. We have an increasing number of CEOs who want to be data-led, and yet all of them struggle with what that means.

Data is the legendary alchemical substance. It is the philosophers’ stone. Its discovery is the Magnum Opus. Without it, businesses are like primitive creatures devoid of proactive and constructive action. Data is the prima materia (first matter), the anima mundi (the world soul). It is responsible for the mystical enlightenment and immaculate conception of consciousness, intuition, and intentionality. Reinforcement, compression, and integration of data is intelligence, in a multitude of its forms, as practised by natural selection and evolution, and, for me, it is the route to the next generation of businesses and artificial intelligence. General business value will be best derived from the democratisation of data, AI, and environmental cognisance.

In an increasingly competitive world, businesses need intelligence, purpose, personalities, partner ecosystems and evolutionary desires. The abilities of businesses to capture diverse intelligence and deliver competitive advantage via data will be the difference between survival and evolution. The current critical challenge for businesses lies at the intersection of ethics, data, regulation, innovation, and purpose.

These pressures are leading to every network being connected, linking users, applications, and ecosystems across blockchain networks and using multiple trust layers to facilitate interactions and value creation. However, for businesses undertaking this digital journey, there are significant data, technological, and people challenges to address, especially if they are seeking to reinvent existing capabilities. Massive volumes of siloed data are clogging today’s manufacturing, supply, distribution, and user-experience systems, exposing enterprises to operational risk, latency, high costs, and low efficiency.

There are three aspects to democratising data (a minimal sketch combining them follows the list):

  1. Data Parameterisation and Characterisation.

  2. Data Decentralisation using an OS of blockchain and DLT technologies, as well as an independently governed secure data exchange to enable trust.

  3. Consent Market-driven Data Monetisation.
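
The following is a minimal sketch of how these three aspects might fit together. All names are hypothetical, and a plain dictionary stands in for the DLT; a real deployment would sit on blockchain/DLT infrastructure and an independently governed exchange.

```python
# Sketch of the three aspects: parameterisation/characterisation,
# decentralised registration, and consent-driven monetisation.
import hashlib
import json
from dataclasses import dataclass, field

@dataclass
class DataObject:
    owner: str
    schema: dict            # 1. parameterisation: typed description of fields
    characteristics: dict   # 1. characterisation: quality, context, lineage
    consents: list = field(default_factory=list)  # 3. consent records

    def fingerprint(self) -> str:
        """Content-address the object so a ledger entry can refer to it."""
        payload = json.dumps(
            {"owner": self.owner, "schema": self.schema}, sort_keys=True
        )
        return hashlib.sha256(payload.encode()).hexdigest()

ledger: dict[str, dict] = {}  # 2. decentralisation: stand-in for a DLT

def register(obj: DataObject) -> str:
    """Anchor the object's fingerprint and ownership on the 'ledger'."""
    key = obj.fingerprint()
    ledger[key] = {"owner": obj.owner, "consents": obj.consents}
    return key

def grant_consent(obj: DataObject, buyer: str, purpose: str, fee: float) -> None:
    """3. Monetisation: consent is explicit, purpose-bound, and priced."""
    obj.consents.append({"buyer": buyer, "purpose": purpose, "fee": fee})

sensor = DataObject(
    owner="port-operator-07",
    schema={"throughput_mbps": "float", "latency_ms": "float"},
    characteristics={"quality": "validated", "context": "smart-port crane"},
)
grant_consent(sensor, buyer="logistics-ai-ltd", purpose="route-optimisation", fee=120.0)
print(register(sensor))
```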

When it comes to connecting assets, two features will accelerate the adoption and usage of data democratisation: decentralised identity management (the likes of which is provided by frameworks such as EyA) and the monetisation of business data objects by their owners. Decentralised identity management enables multiple individuals and organisations to identify, authenticate, and authorise participants, enabling them to access services, data, or systems across multiple networks, organisations, environments, and use cases. It empowers users and enables a personalised, self-service digital onboarding system, so that users can self-authenticate without relying on a central administration function to process their information. At the same time, it ensures the user is authorised to perform actions subject to the system’s policies, based on their attributes (role, department, organisation, etc.) and/or physical location.
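
As a rough sketch of this pattern, consider the following: a generic public-key challenge-response with attribute-based authorisation, not EyA's actual framework, with all identifiers and policies invented for illustration.

```python
# Sketch: self-service onboarding with decentralised identity. The user
# proves control of a key pair; authorisation comes from attributes,
# not from a central administrator.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# User generates and holds their own key pair; only the public key is shared.
user_key = Ed25519PrivateKey.generate()
registered = {"did:example:alice": user_key.public_key()}  # hypothetical DID

# Attribute-based policy inputs: what a verified holder of this DID may do.
attributes = {"did:example:alice": {"role": "analyst", "org": "telco-a"}}

def authenticate(did: str, challenge: bytes, signature: bytes) -> bool:
    """Self-authentication: verify the signature over a server challenge."""
    try:
        registered[did].verify(signature, challenge)
        return True
    except (KeyError, InvalidSignature):
        return False

def authorise(did: str, action: str) -> bool:
    """Policy check based on attributes, not a central admin decision."""
    attrs = attributes.get(did, {})
    return action == "read-network-data" and attrs.get("role") == "analyst"

challenge = b"nonce-1234"
signature = user_key.sign(challenge)
if authenticate("did:example:alice", challenge, signature):
    print(authorise("did:example:alice", "read-network-data"))  # True
```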

We have been witnessing the creation of data democratisation platforms, whether in the form of the personal data ‘pods’ of the Flanders local authority, the Estonian digital government, Dawex, AWS Data Exchange, DataEx, convexglobal, and so on. Any data exchange platform must enable the following (a minimal catalogue-entry sketch follows the list):

  • Cataloguing services and exploration

  • Metadata, taxonomy, and ontology

  • Data context, quality, and analytics

  • A unified view of data assets

  • A glossary of data

  • Pointers to a single source of truth

  • Data lineage

  • Decentralised identity management

  • A gateway to the data ontology

  • Access for all, with tokens determining consumption rights

  • Synthetic data for hypothesis generation and proactive reaction

  • Intelligent data
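
Here is the catalogue-entry sketch promised above, combining metadata, lineage, a glossary term, a single-source-of-truth pointer, and token-gated consumption. The field names, URI scheme, and token scheme are hypothetical, not any specific platform's schema.

```python
# Sketch: one catalogue entry in a data exchange, with token-gated access.
from dataclasses import dataclass

@dataclass
class CatalogueEntry:
    name: str
    glossary_term: str            # glossary of data
    source_of_truth: str          # pointer to a single source of truth
    lineage: list[str]            # data lineage: upstream entry names
    metadata: dict                # taxonomy / ontology tags
    required_token: str           # tokens determine consumption rights

def can_consume(entry: CatalogueEntry, held_tokens: set[str]) -> bool:
    """Access for all, gated by tokens rather than by identity alone."""
    return entry.required_token in held_tokens

entry = CatalogueEntry(
    name="cell-site-throughput-daily",
    glossary_term="throughput",
    source_of_truth="exchange://telco-a/ran/throughput",  # hypothetical URI
    lineage=["ran-raw-counters", "hourly-rollup"],
    metadata={"domain": "RAN", "ontology": "network-performance"},
    required_token="tok.read.network-performance",
)
print(can_consume(entry, held_tokens={"tok.read.network-performance"}))
```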

Whilst many of these exchange platforms have their weaknesses, they make it easy to subscribe to and use third-party data in the cloud. Data democratisation is the action of making something accessible to everyone: the democratisation of information through technology. Increased interconnection and integration will bring multiple new forms of cloud support, providing users with identity and privacy management that spans multiple deployments, projects, and clouds, in a manner that supports corporate and national governance policies. Access to products and services will no longer be restricted by a company’s ability to sell and distribute, but will be implemented via multiple data-exchange marketplaces. Control and ownership of data will continue to be viewed as a currency and managed accordingly. Critically, individuals rather than corporate entities will be able to trade and rent their data in a manner commensurate with the value and purpose of its use. In this model of data exchange, the underlying platform could be a permissioned blockchain network based on connected Hyperledger Fabric nodes, enabling versioning and validation of a single source by using a suite of stable frameworks, tools, and libraries for enterprise-grade blockchain deployments. This platform would record the data’s origins and changes, and the record would be securely shared among multiple parties or on an open registry (a multi-party solution helps to tackle the various challenges posed by siloed data and multi-tier supply chains).
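
As an illustration of why a ledger of this kind makes origins and changes trustworthy, here is a minimal hash-chain sketch in plain Python. A real deployment would use Hyperledger Fabric chaincode rather than this stand-in; the sketch only shows why tampering with recorded history is detectable.

```python
# Sketch: a data object's provenance as a hash chain, where each record
# commits to the previous one, so editing any record breaks the chain.
import hashlib
import json

def append_record(chain: list[dict], event: dict) -> None:
    """Append an event that commits to the previous record via its hash."""
    prev_hash = chain[-1]["hash"] if chain else "genesis"
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain: list[dict]) -> bool:
    """Recompute every hash; any edited record invalidates the chain."""
    prev = "genesis"
    for rec in chain:
        body = {"event": rec["event"], "prev": rec["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != digest:
            return False
        prev = rec["hash"]
    return True

provenance: list[dict] = []
append_record(provenance, {"action": "created", "by": "sensor-17"})
append_record(provenance, {"action": "cleaned", "by": "pipeline-a"})
print(verify(provenance))                      # True
provenance[0]["event"]["by"] = "someone-else"  # tamper with the origins
print(verify(provenance))                      # False
```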

With this in mind, dynamic and complex relationships between any forms of data can be developed organically, without human intervention. In essence, the decentralisation layer needs to sit on top of all the decentralisation ecosystems to enable effective exchange; we need a virtualisation of ecosystems. With such solutions, we can mark data at birth with varying levels of permissions and access control. When multiple organisations opt into a contract to loan or rent data, no raw data is ever seen by the renting party: all analytics are performed within trusted computing technology, removing the barriers of trustless computing. In this way, the semantics of any business data object can be governed throughout its lifecycle.
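
A minimal sketch of that rental model follows, with an ordinary class standing in for the trusted-execution boundary; the record fields and contract terms are illustrative assumptions.

```python
# Sketch: the renting party submits queries but never sees raw records;
# only aggregates over contractually permitted fields leave the boundary.
from statistics import mean

class RentedDataset:
    def __init__(self, records: list[dict], permitted_fields: set[str]):
        self._records = records          # never exposed directly
        self._permitted = permitted_fields

    def aggregate(self, field_name: str) -> float:
        """Release only aggregates, and only over permitted fields."""
        if field_name not in self._permitted:
            raise PermissionError(f"{field_name!r} not covered by the contract")
        return mean(r[field_name] for r in self._records)

dataset = RentedDataset(
    records=[{"latency_ms": 9.5, "user_id": 1}, {"latency_ms": 11.2, "user_id": 2}],
    permitted_fields={"latency_ms"},
)
print(dataset.aggregate("latency_ms"))  # 10.35
# dataset.aggregate("user_id") would raise PermissionError: not in contract
```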

We have come up with an AI periodic table that shows the main components in the value chain, as well as the possibility of constructing data value indices to determine which clusters of data add value and which will become useless or siloed.
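
As a purely illustrative sketch of such an index (the factors and weights below are assumptions, not the actual table), the point is only that value can be scored so that clusters of data can be ranked and silo candidates identified.

```python
# Sketch: a hypothetical weighted data value index over [0, 1] inputs.
def data_value_index(usage_rate: float, uniqueness: float,
                     linkage: float, quality: float) -> float:
    """Weighted score in [0, 1]; each input is normalised to [0, 1]."""
    weights = {"usage": 0.3, "uniqueness": 0.25, "linkage": 0.25, "quality": 0.2}
    return (weights["usage"] * usage_rate
            + weights["uniqueness"] * uniqueness
            + weights["linkage"] * linkage
            + weights["quality"] * quality)

print(data_value_index(0.9, 0.7, 0.8, 0.95))  # well-used, well-linked data
print(data_value_index(0.05, 0.3, 0.1, 0.6))  # candidate for the silo
```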

In summary, IoT and 5G represent a wake-up call to the industry to reimagine value extraction from things other than connectivity. If this warning is not heeded, others will eat away at the space, among them big tech companies, satellite operators, and cable providers. The battleground is ideas and innovation. Everything in nature is about evolution, adaptation, and environmental cognisance, as well as proactive action. The battleground for business is ever evolving.

About the author

Dr Tirath Virdee is the Founder and Director of a number of companies and startups and his interest spans many verticals including financial technology. These include Sensory Intelligence Limited, Neura Technologies Limited, Data Alchemy Limited, Xenesis Limited and DRE Digital. He is also the CTO at Data for Good Limited.

He is involved in researching, applying, and writing about AI, blockchain, quantum computing, and cybersecurity. His primary current interests are intelligent data, virtualisation, and the environmental adaptation of intelligence. He has been an advisor to a number of national and governmental bodies as well as commercial enterprises on issues related to data, sustainability, and extracting business value from data, as well as the application and implications of AI-human hybrids.

He is a permanent member of the UK All-Party Parliamentary Groups on AI and Blockchain and was an advisor to the Scottish government on AI strategy. He was Director of Artificial Intelligence at Capita, Director of the Advanced Technology Group at Siemens AG, a senior scientist at Oxford’s Numerical Algorithms Group, and a physicist in the UK Atomic Energy Authority’s Breeder Reactor Programme. He has a PhD in Engineering Mathematics, an MSc in Radiation and Immunology, and a first degree in Physics.

Tirath is the author of "Data Alchemy: The Genesis of Business Value" and a major contributor to the reference work for the legal profession - "AI: Law and Regulation". He is also an advisor to a number of VCs.
