Neu.ro joins the AI Infrastructure Alliance

Neu.ro is humbled to be a Founding Member of the AI Infrastructure Alliance, working with 25 of the world’s most innovative companies to build the canonical tech stack for AI/ML.

Our partnerships in the Alliance will help to create a Canonical Stack for AI by driving strong engineering standards and creating seamless integration points between various layers of the AI infrastructure ecosystem. 

The AI and ML space currently lacks a standard set of tools and solutions, blocking data science teams from sharing their work and collaborating across the world. Instead, there is a wild proliferation of proprietary, cloud lock-in solutions that benefit individual companies, but not the data scientists and engineers building the AI applications of today and tomorrow. The Alliance came together to help those data science teams break out of lock-in so they can build on top of a standardized, open platform that works across all of their environments.

“Time and again, I’ve seen development teams get excited about the potential of AI to transform their business and applications, only for them to get stopped dead in their tracks by a fragmented and confusing array of technologies with little to no integration,” said Dan Jefferies, Director of the AIIA. “Despite a massive surge of partial solutions, no single tool exists that lets teams leverage the true power and potential of AI. The AI Infrastructure Alliance will help create clarity in this confusing space by building a cohesive framework and bringing together leaders and innovators to help set the standard for how data science teams build models now, and into the future.”

The AI Infrastructure Alliance provides a range of benefits to member companies, including the opportunity to:

  • LEARN – Individual data scientists and data engineers get together in the real world and the digital world to connect, network, learn, share, and find jobs and collaboration partners.
  • CREATE – AI infrastructure software creators will work with other vendors to define engineering standards and drive adoption for their tools.
  • INTEGRATE – Solutions Integrators will help enterprises incorporate AI/ML into businesses and products.
  • ADOPT – Engineering and C-suite leaders will receive guidance and support as they look to make platform and tooling decisions for their teams.
  • NETWORK – Thought leaders, VCs, industry analysts and others can keep up to date with the latest industry trends, discover best practices, network and exchange information with their peers.

“Creating a place for top AI companies to work together will speed development of the infrastructure that businesses really need to make the promise of AI a reality,” said Joey Zwicker, Co-Founder of Pachyderm, a founding member of the AIIA. “As the Canonical Stack comes together, it will vastly reduce time to value for any company, in any industry, that’s leveraging AI across their business.”

Core founding members include Pachyderm, Seldon, Determined AI, Algorithmia, Tecton, ClearML by Allegro AI, Neu.ro, ZenML by Maiot, DAGsHub, TerminusDB, WhyLabs, YData, Superb AI, Valohai, Superwise.ai, cnvrg.io, Arize AI, CometML, Iguazio, UbiOps, and Fiddler. These companies have raised over $200M in collective venture capital funding from top firms including Andreessen Horowitz, Sequoia Capital, GV, Benchmark, NorWest Venture Partners, Madrona Venture Group and Gradient Ventures.

AI Transformation Takes a Village

Arthur is an AI Strategist focusing on Digital Transformation, hardware and cloud infrastructure. Working with industry leaders in the GPU, cloud services and ML software sectors, he is dedicated to helping startups and enterprises find the hardware and cloud solutions they need to responsibly scale AI Transformation. Arthur takes a hands-on approach to engagement of tech and business teams to facilitate the development of specific AI/ML use cases, educate stakeholders, select ML technologies, and integrate AI/ML solutions into business processes.

Arthur’s partnership mission is focused on growing Neu.ro’s ecosystem to include the world’s most innovative technology partners, establishing a network that provides our clients with all the resources, technologies and people required to effectively and safely scale their AI strategies.

Arthur is passionate about learning and is most happy in a room of smarter people developing solutions to hard problems.

Certifications:
MIT Sloan Executive Education – Digital Transformation
Columbia University Executive Education – Executive Data Scientist
AWS Fundamentals Specialization

Synthesis AI

A San Francisco-based AI infrastructure company needed robust MLOps to unblock scaling of their synthetic data platform.

Revenue Grid

An AI-centric CRM company sought to unblock production scaling of their NLP pipelines for email integration, automated sequencing and guided selling.

MegaFon

A leading telecom company needed a complete AI/ML development environment deployed within their existing cloud infrastructure. They got it within 2 months.

Rethinking CRM with AI

AI has the potential to improve the effectiveness of almost every aspect of CRM, from lead scoring and contract management to account management and forecasting. AI also has the ability to fundamentally change the nature of CRM from a sales tool and system of record to a constant companion in the sales process and system of engagement, guiding both salespeople and managers and providing new insights into buyer signals that will improve processes and allow salespeople and sales teams to scale as never before.

CRM has long been fundamental to sales. According to Capterra, 65% of all companies adopt a CRM system within their first five years. Industry research has shown that a CRM typically increases revenue per salesperson by over 40% and that the highest-performing salespeople use CRM the most. Furthermore, investment in CRM has been shown to deliver one of the highest ROIs in technology, with an estimated return of over $8 for every dollar spent and an average time to ROI of 13 months.

In their original form, CRMs were largely tracking and organizational tools consisting of a contact database used for storing leads, prospects and customers, together with a basic activity-logging system for tracking all forms of contact with the sales organization, from phone calls to emails to meetings. Add to this the power of scheduling and reminders in line with best practices and sales strategies and it is easy to see how CRM became the nerve center around which modern sales organizations are built. It is now part of the essential equipment of sales teams, along with email, the telephone, the winning smile and the firm handshake.

From this powerful initial base of functionality, CRM quickly expanded to incorporate other elements of the overall sales workflow, including lead scoring, forecasting, account management and contract management, and eventually subsumed related software categories such as CPQ (configure, price, quote).

In recent years, CRM has seen a major jump in effectiveness with the addition of three new capabilities: mobile access, social integration and more advanced analytics.

According to Nucleus Research, 74% of companies in a recent survey were using mobile CRM applications, while 47% were providing integration of external social networking data to their sales teams. They further found that adding social features and mobile access to CRM applications increased the productivity of salespeople by 26.4%.

Mobile CRM had existed in some form since the nineties, but with the advent of the smartphone, it truly came into its own as feature-complete, mobile-native solutions fit seamlessly with salespeople’s on-the-go lifestyles and work habits.

Analytics solutions have likewise seen deeper integration with CRM systems in recent years in such a way as to dramatically improve its effectiveness. Predictive analytic capabilities such as forecasting sales or the likelihood of prospects to become paying customers now allow managers to more efficiently deploy their sales capacity and focus on areas most likely to generate revenue in a given quarter.

Finally, social integration has enabled companies and salespeople to become more accessible on the most popular social networking platforms, as well as providing a wealth of new data for targeting and analytics. In a recent survey, Nucleus Research found that adding social features and mobile access to CRM together resulted in over 25% increased productivity for salespeople.

None of these advances have fundamentally changed the nature of CRM as a system of record, however. At heart, the CRM is still little more than a repository of customer information with sophisticated scheduling and communications tools. The best practices and habits of the most successful salespeople (who are able to most successfully use these tools) still exist in the realm of know-how and intuition, outside of the bounds of the software itself.

This brings us to the latest major leap forward in the evolution of CRM, currently underway via the integration of artificial intelligence (AI) throughout the entire CRM technology stack.

AI in CRM

Gartner recently predicted that by 2020, 30% of all B2B companies would employ some kind of AI to augment at least one of their primary sales processes.1 This forecast level of AI penetration in CRM may already have come to pass: deep learning algorithms are being leveraged by CRM providers across numerous areas to improve the efficiency of sales teams, including:

  • Pricing Optimization: deep learning algorithms operate on detailed data about prospects and the competitive environment (such as the client’s industry, client size, whether the company is public or private, the level of decision-makers involved, deal history with the firm, number of competitors, etc.) to recommend the pricing most likely to win a deal.
  • Forecasting: deep learning systems have the potential to improve forecasting accuracy in order to better understand opportunities for cross-selling and up-selling with existing clients. 
  • Lead Scoring: combining client contact and spending data with social media postings and past deal history in order to rank leads in the pipeline according to their chances of closing successfully.

AI can also assist in automating almost half of administrative sales work, such as processing sales orders, managing customer accounts, performing data entry tasks and arranging sales appointments. Using AI-driven tools and automation to reduce the time required for reps or admins to complete these tasks promises to free up crucial budget, giving sellers more time to prospect, find new revenue and upsell existing clients.2

The most important change being ushered into the world of sales automation by AI, however, is not in the form of individual AI-driven upgrades to existing CRM functionality, but rather a more fundamental change in the utility of CRM and the way it is used.

This new evolution of CRM systems enabled by Artificial Intelligence has come to be called Guided Selling and promises to completely change the relationship between salespeople and CRM. 

Guided Selling

Whereas previously CRM represented a system of record for customer and deal related data, under the new paradigm of guided selling, CRM has evolved into a system of engagement that is able to function like a personal selling consultant and coach. In this way, guided selling enables the CRM to prescribe optimal sales execution steps in order to enforce more efficient sales processes.

Neu.ro client Revenue Grid conceptualizes guided selling as a system of adaptive data-driven notifications that give reps step-by-step guidance on each deal.

According to market leader Salesforce, guided selling gives a rep a better way of selling to customers by recommending paths or next steps based on data, not instinct. Likewise, a recent Bain & Company report states, “Guided Selling analytics can proactively recommend actions that sales reps should take.” 

Gartner predicts that “By 2024 at least 51% of sales teams will have, or plan to deploy, algorithmic-guided selling”.

With Guided Selling, the CRM coaches teams on where to focus in their current pipeline, shows how fast or slow deals are moving along and recommends concrete next steps and actions. These AI-generated prescriptive next best actions tell sellers what to do to close deals and prospects as quickly as possible, even in situations that would ordinarily require a lot of human intuition and educated guessing, such as what to do next on a complex B2B deal.

One key element of guided selling is providing sales teams an opportunity to capitalize on the sales data that they have been collecting over many years in their traditional CRM systems. Another aspect is the automation of sales activity capture and sales pipeline management, which ensures that the CRM has enough data to operate and frees reps to spend more time selling, leading to faster close times, larger deal sizes and increased forecast accuracy.

Finally, a key advantage of guided selling is that it forces sales teams to come to terms with the difference between their previously defined sales process and what historically actually takes place in moving deals forward. In many cases, the sales team may realize it is not driving the sales process as it had assumed, but is merely reacting to buyer signals along the way and that the buyer is leading the process. Guided selling can make sense of these complex signals and quantify what had previously been relegated to the realm of salesperson’s ‘intuition.’

Furthermore, once these processes have been quantified, they then become repeatable and scalable as the number of deals completed and the size of sales teams grow.

For more information, see Neu.ro’s Case Study with client Revenue Grid to learn how we add value to AI transformation for leading CRM providers. 

AI for Regional Cloud Providers

The Critical Window for Regional Cloud

US-based tech giants have leveraged their first-mover advantage to develop a commanding lead in the fast growing $250+ billion global cloud computing market in terms of revenue and market share. As these Tier-1 giants continue to grow internationally, they represent a significant threat to regional and international cloud service providers. 

That being said, we believe that the cloud services market remains a strong and defensible growth business for forward-thinking and well-positioned Tier-2 providers, regardless of location or current market share.

Significant near-term growth opportunities do in fact exist for international public cloud service providers, but it is essential that these players position themselves quickly to defend and grow their market share as hyperscale cloud providers continue to expand both their global data center coverage and enterprise service offerings.

According to Gartner, countries such as Poland, Brazil and Australia are already among those with the highest percentage of total IT spending on Public Cloud Services; and countries such as Russia, Indonesia and India are among those with the highest growth rates in IT spending on cloud.

There are several factors driving demand for independent Tier-2 providers – which we define as regional, country-level and niche cloud service providers primarily outside of the US. Most important among these are: data security, performance, and customer preference. We would add to this list the growing trend of Artificial Intelligence (AI) and Machine Learning (ML), which is an underappreciated driver that will be key to maintaining competitiveness and unlocking future growth. 

As AI technology evolves, AI workloads are demanding ever greater amounts of specialized compute for both training and inference. These workloads often come with significant data privacy and data location issues, all of which present natural opportunities for well-placed regional cloud providers. But in order to capture the opportunity presented by AI, service offerings must be comprehensive and designed to ensure customer success – or regional cloud providers could find themselves losing out to fast-growing Tier-1 competitors.

The Local Advantage: Data Privacy, Performance and Preferences

As more and more businesses opt to migrate their legacy applications and data to the cloud, or choose cloud-native services to host their core business platforms, they are confronting a range of regulatory and compliance obligations regarding data security, which can limit their options in choosing a cloud service and storage provider. 

For many businesses, it is critical to choose a cloud solutions partner who can offer in-country data storage, with the benefits data sovereignty brings.

The UK Financial Conduct Authority (FCA), for example, requires that firms ensure their cloud providers do not store data in jurisdictions that may inhibit effective access for UK regulators.

Transferring data between the US and EU also comes with significant legal and budgetary requirements, as US companies are required to register annually with the EU’s ‘Privacy Shield’ system and nominate a dedicated ombudsperson to respond to any consumer information request directly.

In Russia, an amendment to the Personal Data Protection Act that went into effect in 2015 requires that the personal data of Russian citizens be stored on Russian servers.

Brazil, Mexico, Argentina and Colombia all require that cross-border data transfers be restricted to countries whose data protection laws are of a similar quality to those of the home country – potentially creating legal issues as regulations regularly change in various data domiciles abroad.

All of these issues can be avoided by domiciling sensitive data in its country of origin.

Another key issue for clients choosing among cloud and hybrid service options is the performance of their applications in the cloud environment. For a number of services, including data streaming and voice applications, network latency is the primary consideration. In many cases, same-country providers have a distinct advantage over Tier-1 providers in this area.

Furthermore, in data-centric AI/ML applications, co-location of large training databases with ML-accelerated hardware, such as GPUs and TPUs, can be crucial to efficient scaling of production applications. Again, same-country independent providers are at an advantage to Tier-1 giants half a world away.

Finally, the selection of a cloud provider is a mission-critical decision for clients, and is associated with high costs and a long time horizon. Here, forward-looking cloud service providers can outperform Tier-1s with localized relationship management, customer support and tech support.

AI in the Cloud

According to Gartner’s 2020 CIO Survey, enterprises expect to double the number of AI projects in place within the next year, and over 40% of them plan to deploy new AI solutions by the end of 2020.1

As early as 2017, an MIT survey showed that ML and AI were the fastest growing cloud-based workloads.2 More recently, technology market research firm Omdia forecast that by 2025 AI will account for as much as 50 percent of total public cloud services revenue.3

As Omdia notes in their 2019 AI Market Forecasts report, AI adoption in the cloud means that, “essentially, another public cloud services market will be added on top of the current market.”

And these workloads are by no means a lock-in for Tier-1 cloud providers. GPU maker NVIDIA’s CEO, Jensen Huang, recently noted that the “flood” of AI workloads headed for the world’s data centers is proliferating more and more beyond the hyperscale cloud giants. According to Huang, ML will continue growing as a portion of the total computing power used inside cloud data centers, and he expects to see ML in data centers operated in every country and industry in the near future – from managed service providers to banks. “It’s going to be everywhere.”4

IT market researcher IDC estimates that spending on AI systems will exceed US$79 billion globally by 2022, growing at an annual rate of 38 percent.5 Looking further out, market research firm Omdia projects that the AI software market will reach $118.6 billion by 2025.

Furthermore, researchers see spending spread broadly across 30 major industries, including telecommunications, energy, manufacturing, banking and insurance; but also in the future among more highly geographically dispersed and potentially disruptable industries such as advertising, education, legal, sports, consumer, entertainment, gaming, retail, fashion and healthcare.

According to Gartner VP David Cearley, “Over the next 10 years, virtually every app, application and service will incorporate some level of AI.” 

“The global AI market is entering a new phase in 2020 where the narrative is shifting from asking whether or not AI is viable, to declaring that AI is now a requirement for most enterprises that are trying to compete on a global level,” according to Gartner. They also add that “through 2023, AI will be one of the top workloads that drive IT infrastructure decisions in large enterprises.”6

And while AI implementation is still accelerating within large enterprises as they move from pilot projects to broader implementation, the next wave of adoption will see a long tail of small and medium-sized businesses, local governments, startups and nonprofits begin to move the needle on overall spending in the sector. As in previous enterprise technology cycles, AI adoption was initially driven by Fortune 100 companies, major global banks and hyperscale technology leaders. But, as AI-related technologies mature and use cases continue to be proven out, we are now seeing the scope of AI adoption broaden significantly.

This next leg of growth in AI adoption will have especially outsized implications for cloud service providers outside of the traditional big three of AWS, Azure and GCP.

Cloud AI Adoption: High volume, high performance and high sensitivity

In a recent survey of enterprise AI practitioners by supercomputer maker Cray, 32% of respondents said that data locality and security was their top priority in choosing infrastructure for enterprise AI workloads, but an even higher 39% named performance as the top priority.7

As we look at the requirements of developers who are seeking to move their AI/ML pipeline to the cloud, we must look beyond basic deployment questions and speed metrics and into broader ML workflows, including: data preparation and analysis, model training and evaluation, and tracking and understanding of ML artifacts and dependencies. 

Creating a cloud-based AI pipeline requires significant MLOps (DevOps for AI/ML) expertise and manpower to create, manage and maintain the environments for these ML workflows. It also means abstracting away the specifics of the physical and network infrastructure with orchestration systems such as Kubernetes or Docker Swarm, which themselves require specialized expertise to set up and maintain.

Another key consideration that is often overlooked by companies early in their AI journey is co-location of data with the AI/ML-focused computational power needed for training.

How Independent Cloud Providers get their piece of the growing AI market

Up to this point, second-tier cloud providers have seen their greatest market successes by operating and competing with the majors on the principles of regional presence, openness and simplicity. And these same principles will also be the key to cloud success in the AI space.

  • Regional presence will be even more crucial in the AI cloud world than it was for traditional enterprise applications and storage, given data security regulations and the need for co-location of massive data storage and AI-accelerated hardware.
  • Openness is a key value required to allow AI developers to use the tools of their choice in developing the applications of the future, and to allow for AI workloads to be transported between on-premise, cloud and hybrid environments. Major cloud providers fail in this respect as they continue to push their in-house solutions, tools and languages in order to force vendor lock-in.
  • Simplicity will be essential in providing a robust and easy-to-use MLOps environment that frees expensive and limited developer talent to spend time creating AI solutions rather than managing cloud environments, operations and ML pipelines.

As more and more companies, from startups to enterprises, scale AI into production, there will be massive demand for AI/ML infrastructure and services. 

While cloud giants are seeking to conquer this space with proprietary platforms and services that lock users into their offerings and toolsets, local cloud providers have a unique opportunity to alleviate data sovereignty and security fears while providing the openness, simplicity and regional presence that their customers need. It is essential that forward-thinking local providers do not surrender this enormous market to foreign Tier-1 cloud giants. AI can and should be the game where the home team wins.


  1. Gartner CIO Survey 2020
  2. MIT
  3. Omdia (formerly Tractica) and Deloitte
  4. Huang, NVIDIA GTC Keynote 2020
  5. IDC (paywall), IDC (summary)
  6. Gartner 2020
  7. Cray Inc. Survey 2019

Neu.ro at the Forefront of Fundamental AI Research and Applications of Machine Learning in Healthcare

In the past 4 years there has been an explosion of research on the application of AI in biology-related fields, particularly in medicine, pharmaceutical development and healthcare administration. 

According to the NIH’s National Library of Medicine, published papers on deep learning, NLP, computer vision and reinforcement learning in this field have been growing at over 50% per annum since 2017, and papers published in the last year already account for 25% of all output in this area over the past 10 years.

Specific advancements and new applications of AI technology are being demonstrated in the areas of drug discovery and screening,1,2,3 prediction of drug success likelihood,4 prediction of disease outcomes,5 guidance on how novel molecules may be synthesized in the lab,6,7 image analysis of microscopy,8 improved diagnostics9 and prediction of health outcomes based on changes in nutrition.10

AI Research: Causality and Explainability

We are also seeing fundamental research taking place in two areas of AI that will have outsized effects on healthcare use cases in the future: causal AI and AI explainability. 

Causal AI will expand the available scope of AI models from finding correlations in data (and successfully generalizing them to previously unseen data) to making actual causal inferences. This is already being used to improve the ability of models to perform accurate differential diagnoses in a range of medical contexts.11 AI Explainability is likewise crucial for improving the trustworthiness and transparency of AI systems in healthcare.12

Availability of Data

In support of these advances, there has been significant progress in terms of the availability of data that can be used for medical and pharmaceutical research. These include open source datasets such as Stanford University’s Medical ImageNet, a petabyte-scale repository of diagnostic imaging studies used for developing intelligent image analysis systems; the Protein Data Bank (PDB) from Brookhaven National Laboratory, which now contains over 150,000 unique structures; and Recursion’s RxRx datasets, which have hundreds of thousands of new images and cell types being released annually.

Data Privacy

Given the sensitive and protected nature of medical records, however, the majority of data in healthcare applications will likely always remain private. To deal with this issue, research is also ongoing in the area of federated learning, a technique for allowing machine learning to be conducted while maintaining data privacy and secrecy. Research papers on this topic grew almost 5x from 2018 to 2019, and more papers were published on federated learning in the first half of 2020 than in all of 2019.13
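The core loop of the federated learning technique mentioned above can be sketched as federated averaging: each data holder trains locally on records it never shares, and a coordinator averages only the resulting model weights. The two “hospital” datasets below are toy values invented for this sketch, not real medical data.

```python
def local_update(weights, data, lr=0.05, epochs=20):
    """One client's private training pass: fit y ~ w.x by SGD on local records."""
    w = list(weights)
    for _ in range(epochs):
        for x, y in data:
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

def federated_average(updates):
    """Server step: average client weights; raw records never leave the clients."""
    return [sum(ws) / len(updates) for ws in zip(*updates)]

# Two "hospitals" whose private data follow the same relation y = 2*x1 + 1*x2
hospital_a = [([1.0, 0.0], 2.0), ([0.0, 1.0], 1.0), ([1.0, 1.0], 3.0)]
hospital_b = [([2.0, 0.0], 4.0), ([0.0, 2.0], 2.0), ([1.0, 2.0], 4.0)]

global_w = [0.0, 0.0]
for _ in range(10):  # each round: local training on-site, then aggregation
    updates = [local_update(global_w, hospital_a),
               local_update(global_w, hospital_b)]
    global_w = federated_average(updates)
```

Real deployments layer secure aggregation and differential privacy on top of this loop, which is what makes the technique attractive for protected health records.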

Startups and Collaborative Efforts

Many of these research efforts into new applications of AI in medicine, healthcare administration and pharmaceutical development have been collaborative in nature. We are currently seeing a proliferation of startups partnering with established pharmaceutical companies, companies conducting joint research with universities, and academic consortia and industry groups forming collaborative efforts focused on these topics. These collaborations hold tremendous potential but also present their own challenges in terms of culture, data security and infrastructure management.

Healthcare Administration

It is also important to note that activity in this area is not merely confined to research. There are real world trials of AI-discovered drugs currently in progress – both Sumitomo Dainippon in Japan and Sanofi in France are conducting phase-1 clinical trials on molecules developed using AI techniques.14,15

Much work still needs to be done, however, in order for this new AI revolution to truly be successful – the regulatory environment must be updated to allow for approval of new AI-developed drugs, services and devices. Insurance policies must also be modified in order to allow for reimbursement for AI-enabled procedures and AI-developed drugs as well.

Incorporating AI into Research Practices and Workflows

Finally, research practices must be updated for evaluating new AI health interventions. A recent review of 20,000 AI studies found that less than 1% of them had sufficiently high-quality design and reporting. Studies frequently suffer from a lack of external validation by independent research groups, generalizability to new datasets and poor data quality.16

Going forward, it is clear that AI will be used for more than drug identification and disease diagnosis. It will also be used to improve patient outcomes and efficiency across the healthcare industry: identifying workflow optimization strategies; reducing failures of delivery, overtreatment, mispricing, fraud and abuse; integrating with smart health record organization and retrieval; and even improving sales and marketing for healthcare organizations. For virtually every member of the healthcare industry, the emerging picture is that AI transformation will be central going forward.

Neu.ro Involvement in Pharmaceutical Research and Healthcare

At Neu.ro, we have been actively participating in this exciting period for AI in healthcare by conducting fundamental research, developing and deploying targeted AI solutions, and providing ML operations and infrastructure setup and management that solve a range of operational, privacy and efficiency issues. Our work has taken the form of direct client engagements with both enterprises and SMBs, collaborations with well-known startups in the space, and joint work with government research organizations.

One recent example is Neu.ro’s development of DeepCycle17, a new technology for modeling the lifecycle of cells with applications in medical and cancer research. The work was conducted in coordination with the European Molecular Biology Laboratory (EMBL) by Neuromation Chief Research Officer Sergey Nikolenko and Senior AI Researcher Alexander Rakhlin.

“For the first time ever, we have been able to develop distributed representations of cell images that actually have a closed cell cycle progression in time. These representations can be used to identify the ‘cell clock’, i.e., current ‘age’ of a cell, which may have important implications across the medical field,” said Nikolenko.

The DeepCycle method was developed using approximately 2.6 million microscopy images of canine kidney cells and “transfer learning,” an approach to modeling that uses knowledge gained on one problem to bootstrap a different but related problem. In this case, the team started with a computer vision model pre-trained on public data containing over a million common images, then refined the model for their cell tasks, making it possible to distinguish between microscopy images of cells at different stages of the cell cycle.
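
Transfer learning of this kind can be sketched in a few lines. The snippet below is a minimal, illustrative toy – synthetic data stands in for real microscopy images and a fixed random projection stands in for the pre-trained backbone; none of it reflects the actual DeepCycle code. The point it demonstrates: the expensive feature extractor is frozen and reused, and only a small classification head is trained on the new task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen, pre-trained feature extractor. In a real
# transfer-learning setup this would be a convolutional backbone
# trained on a large public image dataset; here a fixed random
# projection plays that role.
W_backbone = rng.normal(size=(64, 16)) / 8.0  # scaled so tanh stays unsaturated

def extract_features(x):
    """Frozen backbone: maps raw inputs to a reusable feature space."""
    return np.tanh(x @ W_backbone)

# Synthetic stand-in for the new task's labelled images (two classes).
X = rng.normal(size=(200, 64))
feats = extract_features(X)
true_w = rng.normal(size=16)             # hidden rule generating the labels
y = (feats @ true_w > 0).astype(float)

# The "transfer" step: the backbone stays frozen, and only this small
# logistic-regression head is trained on the new task.
w, b = np.zeros(16), 0.0
for _ in range(300):
    z = np.clip(feats @ w + b, -30, 30)
    p = 1.0 / (1.0 + np.exp(-z))          # predicted probabilities
    w -= 0.5 * feats.T @ (p - y) / len(y)  # gradient of the log-loss
    b -= 0.5 * np.mean(p - y)

accuracy = np.mean(((feats @ w + b) > 0) == y)
```

Because the expensive part (the backbone) is reused rather than retrained, the head can be fit with a handful of gradient steps even when labelled data for the new task is scarce.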

Furthermore, by using the Neu.ro MLOps Platform, Neu.ro was able to manage the entire ML lifecycle for the project, including experiment tracking, hyperparameter tuning, remote debugging, distributed training, and model deployment and monitoring. With Neu.ro, researchers were able to streamline infrastructure management, optimize hosting and compute costs, and accelerate development and deployment of this important new technology.

Crucially, the team was able to install, set up and manage the entire ML pipeline on EMBL’s own infrastructure by tunneling in, without ever setting foot in their offices and without the data ever being visible to our team. All data remained on EMBL’s servers and under their control, with all privacy and security measures in place – an important capability with numerous applications in the healthcare space.

Another significant project in this space was conducted in collaboration with our long-time partners Insilico Medicine, with Neuromation contributing to the creation of druGAN, an advanced generative adversarial autoencoder model for de novo generation of new molecules with desired molecular properties. This work demonstrated for the first time that a customized adversarial autoencoder captures the structure of large molecular databases more accurately than previous variational autoencoder techniques.

In the area of computer vision applications in health, Neu.ro has also completed several interesting projects – including accurate surgical tool identification and health code compliance monitoring, which we will discuss in more detail in an upcoming whitepaper devoted to this topic specifically.

As this exciting area of AI research and development continues to advance, Neu.ro and our team of AI researchers and specialists at Neuromation look forward to continuing to contribute – collaborating with companies in pharmaceutical development and biotech, health maintenance organizations (HMOs), insurance providers, research universities, and governmental and non-governmental healthcare organizations to improve health outcomes for all of humanity.


  1. Cell, February 2020: A Deep Learning Approach to Antibiotic Discovery
  2. arXiv: Principal Neighbourhood Aggregation for Graph Nets
  3. Journal of Medicinal Chemistry, June 2020: Machine learning on DNA-encoded libraries: A new paradigm for hit-finding
  4. bioRxiv pre-print, August 2020: Functional immune mapping with deep-learning enabled phenomics applied to immunomodulatory and COVID-19 drug discovery
  5. Nature Medicine, May 2020: Predicting conversion to wet age-related macular degeneration using deep learning
  6. arXiv: Molecular Design in Synthetically Accessible Chemical Space via Deep Reinforcement Learning
  7. arXiv: Predicting Organic Reaction Outcomes with Weisfeiler-Lehman Network
  8. Neu.ro EMBL release (add link)
  9. Nature, October 2020: International evaluation of an AI system for breast cancer screening
  10. Nature, October 2020: Human postprandial responses to food and potential for precision nutrition
  11. Nature, September 2020: Improving the accuracy of medical diagnosis with causal machine learning
  12. arXiv, October 2019: Asymmetric Shapley values: incorporating causal knowledge into model-agnostic explainability
  13. Google Scholar search
  14. IR News
  15. Financial Times, January 2020
  16. The Lancet Digital Health
  17. EMBL.org, October 2020

AI Transformation for Telecoms

The Scale of Global Telecoms

The global telecommunications industry is immense – the 54 telecommunications companies on the Forbes Global 2000 list accounted for more than $3.4 trillion in assets and totaled nearly $1.5 trillion in revenues last year. The industry has reached this scale through constant adaptation to new technologies and competition. Today, as telecoms look ahead, it is becoming increasingly accepted that there are two major technologies they must embrace in order to succeed: 5G and AI.

5G alone is expected to contribute $13 trillion to global output by 2035, with most of these gains occurring outside the US. According to PwC, AI could contribute up to $15.7 trillion to the global economy by 2030 – more than the current output of China and India combined. To remain competitive, players across the telecommunications industry will need to incorporate both technologies into their strategic growth plans.

5G’s moment may have already arrived with Apple’s recent announcement of the iPhone 12 with 5G. AI, for its part, is currently less readily associated with mobile and business telecommunications services, but we will show why these two technologies go hand in hand – a double wave of digital transformation that has the potential to unlock growth and cement market positioning for the most agile telecoms and enterprises able to take advantage of it.

Waves of Transformation

Global telecommunications leaders are no strangers to transformation. The industry began to change dramatically in the 1980s with the break-up of regional telecom monopolies under market liberalization in the US, Japan and the UK; and this transformation continued with the dawn of the Internet age in the nineties when the technologies of packet-switching, IP (Internet Protocol) and the world wide web came to the fore. Over this period, telecoms grew from legacy providers of circuit-switched voice communications services into the modern high bandwidth mobile and data services providers of today.

The process of transformation in the telecommunications industry has only accelerated in recent years, as internet giants such as Google and Facebook and cloud providers such as Amazon and Microsoft now operate global services from their own hyperscale data centers built on commodity hardware and homegrown software. As telecommunications companies are forced to compete with these newer market entrants, their capital-intensive, technology-focused model has given way to a user-centric service delivery model enabled by next-gen infrastructure.

Modern telecoms today primarily focus on growing wireless revenue by increasing their subscriber base and ARPU (average revenue per user) and by retaining high-value subscribers. They also seek to grow strategic revenue with various pay-as-you-go service offerings, and to reduce their overall cost structures. Once implemented, 5G and AI will provide telecoms with powerful new capabilities for accomplishing all of these goals.

To attract and retain high-value corporate accounts and increase strategic revenue, modern telecoms must offer their corporate customers a variety of innovative services. British Telecom, to use one example with particularly well-developed enterprise IT offerings, provides its customers a fully-fledged enterprise communications and collaboration platform, vertical-focused customizations for the public sector and the retail and utilities industries, and cloud and data security services. They also offer a platform for managing distributed workforces (based on software they developed and dogfooded themselves to manage their own tower maintenance operations).

But service offerings must continuously evolve in order to retain subscribers. As communication and networking technologies inexorably improve, telecoms must adapt and upgrade their infrastructure and procedures to provide customers with the latest services and unlock the additional value those technologies provide. Machine Learning Operations (MLOps) is a key element of these upgrades.

Transformation in the telecommunications industry is never complete; it comes in well-defined waves, some expected and well understood, others captured only by the most agile and forward-thinking. The next wave of transformation for the industry is 5G, and we would argue that a no less important and closely-related wave will be AI.

From Swells to Waves – 5G + AI

5G will undoubtedly be a major step forward – up to 100x faster data speeds and network latency lowered by a factor of five will provide instantaneous access to services and applications. 5G envisions a heterogeneous and software-defined network that can integrate massive numbers of lightweight sensor nodes and a diversity of transmitting and receiving devices such as macro and small cells to provide pervasive connectivity indoors and outdoors, leading to data volumes many thousands of times higher than today.

Proposed use cases for 5G include industrial and consumer IoT, gaming, pervasive 2D and AR/VR media and social experiences, intelligent logistics, autonomous and smart vehicles, and many more.

As Ericsson CEO Börje Ekholm recently said, it is likely futile to try to identify the killer app for 5G at this early stage. “By the time we identify the killer app,” said Ekholm, “it will actually be too late. Many looked for the killer app in 4G, however we were not able to envisage a world where a ride-hailing app would be the normal way of ordering transportation.” Shopping and watching movies on mobile devices was also hard to imagine when 4G LTE development was underway, he added.

The second major transformation wave confronting telecoms today is AI. AI represents a major opportunity for telecoms to improve their internal efficiency and overall competitiveness, to provide compelling value-added external service offerings as well as to facilitate their clients’ own AI transformation efforts. 

Internally, telecoms can use and are already using AI to optimize their networks, improve customer service, conduct maintenance operations and prevent fraud. Externally, telecoms are uniquely positioned to facilitate and extend the AI transformation efforts of their clients.

According to Boston Consulting Group, AI should be central to telcos’ transformation because it will enable them to better cope with fluctuating demand levels, adjust to supply chain disruptions and adapt to sharp shifts in consumer confidence and priorities. They recommend that fast moving telecoms should reinvent themselves by embedding AI at the very heart of not just their products, but also their key processes by using the technology to reimagine their existing operating procedures throughout the organization.

AI will further help telcos drive growth by providing digital services in areas such as healthcare, media, entertainment and third-party analytics – but they must be careful to avoid potential risks to customer privacy that could wipe out years of built-up customer trust.

Overcoming Data Privacy Issues + Unlocking IoT

To address these data privacy issues of AI on mobile networks, operators may opt to implement a framework called federated learning. Federated learning allows neural networks to be trained locally, on the same device where the data is generated or collected. Once trained, the local weights of the neural network are transported to the telecommunication company’s cloud data center, where federated averaging takes place – with techniques such as secure aggregation and differential privacy further ensuring the privacy and anonymity of the data – and a new global model is produced and communicated back to the remote devices. This technique will be particularly useful with health-related data, improving diagnostics without breaching patient privacy, but it is equally relevant to many categories of sensitive consumer preference and behavioral data.
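
One round of the federated-averaging scheme described above can be sketched as follows. This is a deliberately simplified toy – linear models, synthetic client data, and no secure aggregation or differential privacy – not an implementation of any particular operator’s system; it only illustrates that model weights, never raw data, leave the devices.

```python
import numpy as np

rng = np.random.default_rng(1)

def local_update(w, X, y, lr=0.1, steps=20):
    """One client's local training: a few gradient-descent steps of
    linear regression on data that never leaves the device."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Hidden "true" model the clients' data is generated from.
true_w = np.array([2.0, -1.0, 0.5])

# Three simulated devices, each holding its own private dataset.
clients = []
for n in (40, 60, 100):
    X = rng.normal(size=(n, 3))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    clients.append((X, y))

# One round of federated averaging: the server broadcasts the global
# model, each client trains locally, and only the updated *weights*
# are returned and averaged, weighted by each client's sample count.
global_w = np.zeros(3)
local_ws = [local_update(global_w, X, y) for X, y in clients]
sizes = np.array([len(y) for _, y in clients])
global_w = np.average(local_ws, axis=0, weights=sizes)
```

In a production system this round would repeat many times, with the aggregation step hardened so the server never sees any individual client’s update in the clear.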

For less sensitive data, however, 5G could actually make edge computing largely irrelevant, as training AI systems in the cloud would be almost the same as doing the processing on-device. Centralized processing could even be the preferred option in most cases because of higher processing capabilities and less restrictive power budgets.

AI has further synergy with 5G in the context of IoT. The increased bandwidth and volume capacity of 5G has been hailed as a potential pivotal moment for IoT, supporting millions of low-power sensors on the network in both indoor and outdoor environments. These IoT data sources will feed into AI systems, generating future services and efficiency gains – and both centralized and edge AI processing will be employed to support that vision.

An additional option for AI and IoT applications that 5G will present to telecommunications companies is to do processing at the base station or edge router (so-called small and large cells) – essentially creating a middle processing layer between the cloud and the device, known as fog computing. This architecture, while advantageous in terms of privacy and latency, will require significant specialized MLOps capabilities to orchestrate distributed computing resources and coordinate training and feedback loops across multiple networks.

AI for Internal Telecom Operations

AI will also be essential for telecoms to manage the new 5G networks themselves. ML will enable seamless automation of network management to reduce operational costs and enhance user experience, since traditional optimization techniques are not agile enough to handle the complex, real-time analysis 5G networks require. ML at the edge can also be used to predict changes in demand and scale up network resources as needed.
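
A toy version of that demand-driven scaling might look like the following. The traffic numbers, the linear-trend "model" and the 20% headroom policy are all invented for illustration; a real system would use a far richer forecasting model.

```python
import numpy as np

# Hypothetical hourly traffic measurements for one cell site (Gbps).
load = np.array([4.1, 4.4, 4.8, 5.1, 5.6, 5.9, 6.3, 6.8])
hours = np.arange(len(load))

# Fit a simple linear trend and extrapolate one hour ahead.
slope, intercept = np.polyfit(hours, load, deg=1)
forecast = slope * len(load) + intercept

# Scale capacity proactively: provision the forecast plus 20% headroom,
# never dropping below the currently provisioned capacity.
current_capacity = 6.0   # Gbps, assumed current provisioning
needed = forecast * 1.2
new_capacity = max(current_capacity, needed)
```

Even this crude rule captures the essential shift: capacity decisions are driven by a prediction of demand rather than by reacting to congestion after it occurs.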

Already, telecoms globally are using AI technologies to optimize their networks – enabling predictive maintenance, self-healing, and protection against fraudulent activity. AT&T is combining AI and drone technologies by testing computer vision analysis of automated drone footage for cell tower maintenance. Customer service and support, agent-guided assistance, personalized marketing, and product bundling recommendations are also being driven by AI. Dutch telecom KPN is already using NLP deep learning systems to analyze notes produced by its call center agents, and uses the insights generated to make changes to its interactive voice response system.

Supporting Customer AI Transformation

The final, and possibly most important, aspect of AI transformation for telecoms we will cover is the competitive advantage telecoms can achieve over other technology infrastructure providers by facilitating AI adoption for their customers.

Telecoms have several advantages over hyperscale cloud providers and other non-telco competitors. First, telecoms have large numbers of existing B2B client relationships. Second, they are already entrusted to secure sensitive client data. Third, they are frequently local and regional specialists – with closer client relationships and more years of experience in the markets where they operate than competitors. And finally, they possess large existing infrastructure networks with massive capacity, providing them with an extremely low marginal cost basis for providing new high-bandwidth high-volume data services to customers.

On the other hand, it is also fair to say that the hyperscale cloud companies realized the importance of AI workloads for their strategic growth very early, adding massive amounts of ML-accelerated hardware and investing heavily in their own homegrown MLOps software to ease AI transformation.

We contend that telecoms (and the systems integrators and hardware manufacturers who supply them) have a massive opportunity to transform their infrastructure and turn the tables on their growing cloud competitors.

In order for customers to take full advantage of the capabilities of 5G and AI, they will require familiarity with agile development procedures as well as significant MLOps expertise to ensure that ML pipelines can be quickly created and maintained. They will also require access to large pools of specialized ML training hardware (most commonly GPUs) that integrate seamlessly with existing storage for efficient AI training and deployment. 

To this end, it is essential that telecoms provide their customers with a fully-fledged MLOps solution, such as Neu.ro, that integrates with their own offerings. Such a solution combines flexible resource orchestration for ML workloads on their mobile networks and in their clouds with integrated pipeline creation and management tools for the entire ML lifecycle: data collection and preparation, experiment tracking, hyperparameter tuning, remote debugging, distributed training, and model deployment and monitoring.

Automating infrastructure management with an easy-to-use, code-first development environment such as Neu.ro will allow customer AI teams to optimize infrastructure costs, streamline operations management, and freely integrate their choice of leading open-source and proprietary tools. These capabilities will let customers take full advantage of the combined potential of AI and 5G – and secure telecom growth and market share for many upgrade cycles to come.

EMBL

The European Molecular Biology Laboratory advanced traditional microbiology research methods with state-of-the-art Deep Learning.
