How to Approach AI Opportunities in Small and Medium Sized Firms

All firms are exploring AI opportunities and working out how to apply AI to their businesses. The widespread belief that AI will be transformational lies behind the recommendation that firms make significant investments of time and money in building the necessary capabilities. Though undoubtedly valid, this advice has large firms in mind, with substantial resources and the ability to spread risk. Smaller and mid-sized firms need approaches that cost less, play to their strengths and carry lower risk.

AI as a Source of Opportunities

Artificial intelligence is a source of opportunities for businesses, small and large. The necessary data can be assembled from a myriad of sources. Small businesses will need to develop the capabilities to tap into AI effectively and to keep up with change and competition. They need to bring their people up to speed. So, how can they best do that?

AI Courses

There are online-learning platforms, universities, and executive-level programmes to train people. Courses differ, but most are delivered by academics and include theoretical material that is hard to relate to a specific business situation.

AI Skills

Firms can hire new people equipped with these skills. However, there are reports of culture clashes between academic newcomers and pragmatic, experienced incumbents in large firms. This suggests hiring is an even higher-risk option for smaller firms, which rely more on close working relationships and tacit understanding.

AI and Change

An “AI Competence Centre or Academy” embedded in an internal change programme is one option proposed for helping firms “transform.” Whatever the benefits, that level of commitment is unlikely to be cost-effective for a smaller business.

AI for Smaller Businesses

These firms need an approach that plays to their strengths: practical, focussed on company priorities, engaging, hands-on, intuitive and stripped of academic jargon. It is entirely possible to convey AI principles in terms relevant to practical managers. After all, AI merely replicates what practical managers already do, but in a way that requires reams of data and hours of computer time to perform millions of repetitive calculations. It is only feasible because computer processing power is cheap.

Autonomous cars are some way off; autonomous companies are too far off to worry about. Experience and judgement, however, remain indispensable, and small firms have those in spades.

How to Introduce AI

Our recommendation is to introduce AI as just another management tool, applied to a series of selected value-generating purposes. Keep the managers in control and contributing to all aspects of the work, from sourcing and exploring data to interpreting the results. They can then use the results from AI to inform their experienced judgements. Engaging businesspeople and using AI in its assistive role is a pragmatic way to bring AI into a business.
It is entirely healthy for managers to be sceptical and to reject some AI results. The literature is full of embarrassing examples of AI techniques producing results that are not replicable. Firms using AI to generate benchmark scores for investors or advertisers have produced differing scores for the same benchmark, leading to commercial disputes. Data issues will often introduce a degree of uncertainty into the results, which is one more reason why experienced people need to be involved: making decisions under uncertainty is what they have been doing for years.

Presenting AI Results Visually

It is important to present AI results in a way that everyone can understand. This is the role of data visualisations, which are intuitive. After viewing a series of visualisations of a data set, most people can readily draw meaning from them and generate hypotheses about the implications and underlying causal relationships.

Conclusion: How to Apply AI

Use AI to support the analysis and decision process. Be careful to engage users at every stage, banish jargon, and explain that the algorithms are performing simple tasks using lots of data and millions of calculations. Use visual analytics to help people engage with the data and explore underlying causal effects. Explain that data and judgement are the key success factors in applying AI.

Positive experience with AI “in its place”, in its assistive role, creates confidence. Experienced people, with their unparalleled feel for their business, can quickly come to see where AI could create further opportunities for their firm, and they will be able to oversee the task.

Using Analytics to Monitor Profitability, Cash & Risk in a Portfolio of Projects

A Portfolio of Project Risks

Organizations that undertake many projects experience variance in the performance of each project, and so adopt a portfolio approach to managing risk. Provided the projects’ risks are largely independent, the variance of the performance of the portfolio as a whole will be lower than the variance of any individual project.

The risks of the portfolio must be monitored to ensure that the performance of the portfolio is staying within predicted parameters. Early warnings of issues allow management to intervene promptly, either to bring the portfolio back into balance or to increase the contingency provisions or reserve buffers.

Transfer of Risks to Outsourcing and Contracting Firms

In outsourcing and contracting, clients seek to transfer risk to contractors because contractors have the best skills to manage project risks, and the transfer gives them an economic incentive to manage those risks well. The contractor spreads the risk over a portfolio of projects for multiple clients and keeps contingency cash reserves that can absorb the potential variance in the performance of the portfolio.

The portfolio approach assumes that the risks of individual projects are not correlated, and monitoring will reveal where this is not the case. For example, a contractor’s portfolio might show a pattern of overly aggressive bidding, resulting in a series of projects with correlated, high levels of risk that require the contractor to maintain significantly higher reserves. The arithmetic below sketches why.
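
As a sketch of the underlying arithmetic (a standard statistical result, not a figure from any particular portfolio): if a portfolio contains n projects whose outcomes each have variance σ², the variance of the average portfolio outcome is

```latex
% Uncorrelated project risks: diversification shrinks variance as n grows.
\operatorname{Var}\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) = \frac{\sigma^2}{n}

% With pairwise correlation \rho between projects, a floor remains:
\operatorname{Var}\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
  = \frac{\sigma^2}{n} + \frac{n-1}{n}\,\rho\,\sigma^2
  \longrightarrow \rho\,\sigma^2 \quad \text{as } n \to \infty
```

When risks are uncorrelated (ρ = 0), adding projects steadily reduces portfolio variance. Correlated risks, such as those produced by systematically aggressive bidding, leave a residual variance of ρσ² that no amount of diversification can remove, which is why such a portfolio needs significantly larger reserves.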

Failure to manage risks correctly and to maintain adequate reserves can lead to losses that exceed the cash reserves, and contractors can go out of business as a result. A failure to monitor the portfolio risk is damaging for the contractor and the client alike.

Example Case: A Construction Company Client

The client was a Canadian construction firm with a portfolio of projects across western Canada. Rapid growth meant that top management now needed automated reports and analytics to help them stay on top of the profitability and risk of their portfolio of projects. The growth brought increasing revenues but also growing use of cash for working capital and for the risk reserves.

The Data, Data Warehouse and Data Model in SQL Server

We sourced the information for the analysis from accounting, direct-labour and operational systems. To bring this information together, we set up automated data feeds from these systems into a data warehouse in the Microsoft Cloud using SQL Server.
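
As an illustration, here is a minimal sketch of what one such automated feed might look like, assuming Python with pandas and SQLAlchemy over the pyodbc driver; the server, file and table names are hypothetical, not the client’s actual systems:

```python
# Minimal sketch of one automated feed into a SQL Server warehouse.
# All names (server, database, file, table) are illustrative.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://loader:password@myserver.database.windows.net/warehouse"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# Pull the latest export from the accounting system (a scheduled CSV drop here).
ledger = pd.read_csv("accounting_export.csv", parse_dates=["posting_date"])

# Append it to a staging table; downstream views join it with the
# direct-labour and operational feeds for the portfolio calculations.
ledger.to_sql("stg_accounting_ledger", engine, if_exists="append", index=False)
```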

We designed a data model that would make all the data, calculations and predictions required for monitoring easily accessible to business users.

Dashboards in Tableau

We developed dashboards in Tableau that gave visual insights into the performance of the portfolio. These gave prompt and regular updates on the actual and projected performance of the portfolio, with drill down into project detail.

This meant that management could quickly spot when projects were not performing as expected, rapidly assess the implications, review the causes, and decide on prompt interventions to address the issues.

Management could also control cash reserves closely, returning cash to shareholders that was clearly in excess of working-capital and risk-reserve requirements, so improving the return on capital employed.

The Broader Implications: Analytics as a Competitive Advantage

This industry sector is one in which the ability of the supplier to manage and absorb risk is of clear value to the customer. Were a contractor to fail to absorb the risk and then fail as a business, the disruption for its customers could be severe. A contracting firm’s skill at managing risk is therefore one of the criteria customers should use to evaluate a bidder.

Demonstrating the ability to monitor and manage risks and reserves could thus be important competitively. The alternative, a supplier keeping cash reserves large enough to cover any eventuality, would require prices unattractive to customers or make the business very unattractive to investors.

Therefore, the ability to monitor and manage risk, with early, targeted action to correct problems, is valuable for outsourcers, contractors and their clients alike. Analytics is a low-cost way to make the firm both more competitive and more robust. Firms without these analytics and monitoring capabilities run the risk of being less competitive, less attractive to investors, and more prone to failure.

Features of Azure for Analytics and Data Science

The Azure cloud platform provides powerful yet easy-to-use cloud-based tools for data transformation, data analytics and data science. These tools offer Azure developers a great deal of flexibility and cater to a variety of skill levels. This blog provides a brief run-through of some of the features of the Azure tools for data analysts and data scientists.

For further information, or if you have more questions you’d like to discuss, please feel free to contact us at contact@oxianalytics.xyz and we’d be happy to assist you.

Azure Options for Data Analytics

Azure Synapse Analytics

Azure Synapse Analytics is the next generation of Azure SQL Data Warehouse. It lets users bring together many data sources, relational and non-relational, in a single place; these sources can reside locally or in the Azure cloud. The data is unified, processed, and analyzed using SQL, and Azure Synapse Studio acts as a workspace for data-analysis and AI tasks.
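
To give a feel for the SQL-first workflow, here is a minimal sketch of querying a Synapse SQL endpoint from Python via pyodbc; the server, database, credentials and table are placeholders:

```python
# Sketch: querying an Azure Synapse SQL endpoint from Python via pyodbc.
# Server, database, credentials and table names are all placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myworkspace.sql.azuresynapse.net;"
    "DATABASE=warehouse;UID=analyst;PWD=<password>"
)

query = ("SELECT TOP 5 region, SUM(amount) AS revenue "
         "FROM sales GROUP BY region ORDER BY revenue DESC")

# Iterate over the result set and print each aggregated row.
for region, revenue in conn.execute(query):
    print(region, revenue)
```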

Azure Databricks

Azure Databricks is an analytics service based on Apache Spark that can process large datasets quickly. Databricks supports several languages, such as Java, Python, Scala and SQL, as well as libraries such as PyTorch and TensorFlow, and Spark data can be integrated with any of these languages and frameworks.
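
As a flavour of what this looks like in practice, here is a minimal PySpark sketch of the kind of job Databricks runs; the storage path and column names are illustrative:

```python
# Minimal PySpark sketch: read a large dataset and aggregate it in parallel.
# The storage path and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-summary").getOrCreate()

# Read a Parquet dataset from Azure Data Lake Storage.
orders = spark.read.parquet("abfss://data@myaccount.dfs.core.windows.net/orders")

# Total revenue per customer, computed across the cluster.
revenue = (
    orders.groupBy("customer_id")
          .agg(F.sum("amount").alias("total_revenue"))
          .orderBy(F.desc("total_revenue"))
)

revenue.show(10)
```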

Databricks also integrates with Azure Machine Learning, giving an Azure developer access to hundreds of pre-built machine learning algorithms. Through auto-scaling and auto-termination of clusters, it also removes much of the complexity of running Spark infrastructure yourself.

Azure Data Factory

Azure Data Factory is an ETL (Extract, Transform, Load) service used for processing structured data at scale. An ETL process extracts data from various sources, cleans and transforms it, and converts it into a format suitable for analysis. Data Factory lets users build ETL flows with a visual, code-free editor.

There are over 90 built-in connectors to common data sources, including BigQuery, S3 and many others. Data Factory can also copy data effortlessly into Azure File Storage.
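
Although pipelines are built visually, they can also be driven programmatically. Here is a hedged sketch, assuming the azure-identity and azure-mgmt-datafactory Python packages and an already-deployed pipeline; all resource names are hypothetical:

```python
# Sketch: triggering an existing Data Factory pipeline run from Python.
# Assumes azure-identity and azure-mgmt-datafactory are installed and that
# the resource group, factory and pipeline below exist (all illustrative).
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
adf = DataFactoryManagementClient(credential, "<subscription-id>")

run = adf.pipelines.create_run(
    resource_group_name="rg-analytics",
    factory_name="adf-warehouse-feeds",
    pipeline_name="CopySalesData",
)
print(f"Started pipeline run {run.run_id}")
```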

Azure Stream Analytics

Azure Stream Analytics is a real-time analytics service. It provides an end-to-end, serverless pipeline for streaming data: you define the pipeline and express the processing logic in SQL syntax. Processing scales up dynamically with the throughput and volume of the data, and the service offers built-in recovery and machine learning capabilities.

Azure Stream Analytics also lets you add Power BI as an output, allowing Azure developers to visualise data streams in real time in the Power BI service.

Data Lake Analytics

Data Lake Analytics is an on-demand analytics job service that helps simplify big data. It dynamically provisions resources and can automatically scale up or down as required. Using Azure Data Lake Analytics, you can perform data transformations in a variety of languages, such as Python, C#.NET, and SQL, among others. Data Lake Analytics connects to other Azure data sources, such as Azure Data Lake Storage, and performs data analytics on the go. It is also a cost-effective way to run big data workloads.

Azure Analysis Services

Azure Analysis Services can fetch data from multiple sources and build a single semantic model for processing. This model can underpin high-end business intelligence solutions, with security and reduced delivery time. Analysis Services is highly scalable, and it is possible to import existing tabular models or SQL tables into the system.

Azure Tools for Data Science

Azure Machine Learning Studio

Azure Machine Learning Studio is a cloud-based, collaborative, drag-and-drop platform where users of varying skill levels can build, deploy and test machine learning solutions, either using a no-code designer or built-in Jupyter notebooks for a code-first experience. Its automated machine learning helps both professional and non-professional data scientists build ML models rapidly. Pre-configured machine learning algorithms and modules for managing Azure data stores are also available, making Azure ML Studio ideal for any data scientist looking to evaluate a machine learning model’s performance efficiently.
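
For the code-first route, here is a minimal sketch using the v1 Python SDK (azureml-core); the experiment, script and compute names are placeholders:

```python
# Sketch: submitting a training script as an Azure ML experiment run,
# using the v1 SDK (azureml-core). Names below are placeholders.
from azureml.core import Workspace, Experiment, ScriptRunConfig

ws = Workspace.from_config()  # reads the config.json downloaded from the portal
experiment = Experiment(workspace=ws, name="demo-experiment")

config = ScriptRunConfig(
    source_directory=".",        # folder containing the training code
    script="train.py",           # your training script
    compute_target="cpu-cluster",
)

run = experiment.submit(config)
run.wait_for_completion(show_output=True)
```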

Azure ML Studio offers built-in integrations with other Azure services such as Databricks and Data Lake Storage. It also supports open-source frameworks and languages such as MLflow, Kubeflow, PyTorch, TensorFlow, Python and R.

Azure Cognitive Services

Azure Cognitive Services offers a selection of pre-built ML and AI models. It can be used by any developer and doesn’t require machine-learning expertise (see the short sketch after this list). Some of the features of Cognitive Services are:

  • Image processing algorithms that can identify, index and caption your images.
  • Speech recognition algorithms that can convert audio into readable, searchable text, integrate real-time speech translation into your apps, and verify speakers by voice.
  • Mapping of complex data for semantic search and smart recommendations.
  • Natural-language processing that lets apps understand the user’s needs through pre-built commands, with detection and translation of more than 60 supported languages.
  • Access to billions of web pages and images through a single call by adding Search APIs to your apps, enabling safe, ad-free search and advanced features such as video search.
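
As a taste of how little code a pre-built model needs, here is a sketch using the azure-ai-textanalytics package; the endpoint and key are placeholders:

```python
# Sketch: language detection and sentiment with pre-built Cognitive Services
# models. Assumes the azure-ai-textanalytics package; endpoint/key are dummies.
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

docs = ["Ce service est excellent.", "The delivery was late and damaged."]

# Detect the language of each document.
for doc, result in zip(docs, client.detect_language(documents=docs)):
    print(doc, "->", result.primary_language.name)

# Score the sentiment of each document.
for result in client.analyze_sentiment(documents=docs):
    print(result.sentiment, result.confidence_scores)
```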

Data Science Virtual Machine

Azure Data Science Virtual Machine (DSVM) is a virtual machine image with pre-configured data science tools, so ML solutions can be developed in a ready-made environment. DSVM can be the ideal environment in which to learn and compare machine learning tools, as it is a complete development setting for ML on the Azure platform. The tools included span data platforms, ML and AI frameworks, data visualization and development tooling.

The DSVM environment can significantly reduce time to install, troubleshoot and manage data science frameworks. You can use it to evaluate or learn new data science tools. For more information on the full list of tools included with DSVM, click here.

Sustainability Analytics – How to Define Sustainability Metrics

Introduction

There are five key communities with an interest in sustainability: consumers, firms, investors, institutions, and governments. Consumer attitudes are hardening, firms and investors are scrambling to get measures in place, intergovernmental bodies are taking initiatives, and politicians are trying to reach a consensus. The efforts of these communities are commendable, but the analytics for sustainability are in danger of letting them down: few of the metrics stand up to scrutiny. Firms need to critically review the metrics that are put forward to assess their performance, and they need to be ready to influence and lobby to get metrics that are valid. Sustainability issues will inevitably affect firms – their funding, tariffs, taxes, brands, and access to markets.

Sustainability Targets

It looks like we will have a climate disaster on our hands. The world’s sustainability and global warming targets will be missed. We will see increases in the frequency of catastrophic events. For some people in some locations, it is already too late. With the disaster will come unpredictable political fallout. The groundswell of popular support for the sustainability agenda suggests people’s priorities have shifted.

Sustainability Metrics

Analytics and metrics are playing a role. Governments use them to direct incentives such as funding, tax breaks and other forms of state support and subsidy for innovation in sustainability. Investors and lenders use them to move money out of industries, firms, products and brands that may be seen to be part of the problem, so increasing their costs of capital and reducing their returns.

Sustainability Data

There is more and more data available to investors, lenders, and large and small firms alike. Governments and research institutions provide relevant data and metrics as open data, freely available. One aim is to encourage firms to compete to provide innovative products and services that will be part of the sustainability revolution.

Sustainability Analytics and Business Models

The European Union is pursuing initiatives aimed at accelerating this evolution of metrics in order to support regulatory interventions. The European Central Bank will use them to direct its market interventions to support the European Green agenda. Trade negotiators will use tariffs to target imports from firms and countries that are seen not to be supporting the sustainability agenda. Firms may find that regulatory changes will undermine their business models with higher taxes and tariffs, and even stop them accessing certain markets.

Some firms are actively trying to transform their energy mix and their business models, holding the conviction that a business model based on sustainable growth is a guarantee of quality and performance that contributes to the resilience of their brands. Certain consumer segments, while perhaps not willing to pay premiums for brands that support the sustainability agenda, are beginning to shun brands and products that do not.

Sustainability Analytics and Dashboards

Not to include sustainability analytics within your wider analytics agenda would seem perverse. That said, the metrics are still a work in progress: providers of ESG (Environmental, Social and Governance) ratings to investors can’t get their ratings to align, and regulators have not yet standardised metrics across jurisdictions.

All firms need to invest some time in developing their use of metrics on sustainability issues. It will help them identify investments and assets that risk being “stranded” by regulatory changes, help them attract funding and subsidies that will accelerate their growth, and help them communicate their sustainability goals and progress to their own employees and customers. Firms would do well to include sustainability metrics in their performance dashboards, demonstrating a commitment to the “sustainability revolution” to all stakeholders.

Five Sustainability Issues and Innovation in Sustainability Metrics

There is still room for innovation in metrics, but the five sustainability activities to be addressed by firms are very clear:

  • climate change mitigation and adaptation
  • sustainable use and protection of water and marine resources
  • transition to a circular economy, including waste prevention and increasing the uptake of secondary raw materials
  • pollution prevention and control 
  • protection and restoration of biodiversity and ecosystems.

Conclusion

Assessing and measuring a firm’s contribution to addressing all these dimensions of sustainability will become more urgent as the impacts of the current missed climate targets become clearer to everyone, not least consumers.

Real-time Analytics and Power BI

Real-time analytics is a way of analysing data as soon as it is generated: data is processed as it arrives, and the business gets insights delivered without delay.

Real-time analytics is useful when you are building analytics and reporting that you need to respond to quickly. It is how we ensure the analysis is updated with the latest available data when that data changes constantly. It is particularly useful in what we at Oxi Analytics class as “sense and respond” analytical use cases.

These sense-and-respond use cases are usually found where small but quick changes to a process will make a significant impact on the result, where we are trying to minimise risk, or where we need to identify changing patterns quickly to avert serious damage to an area of the business, such as sudden and unexpected changes in customer behaviour. They are found across many business verticals and industry sectors, particularly in brand monitoring, digital marketing and manufacturing.

Power BI’s real-time analytics features are used by organisations across the world, such as TransAlta, Piraeus Bank S.A. and Intelliscape.io. You can read more about these use cases here.

Power BI delivers real-time analytics capabilities with its real-time streaming features. Let’s explore these features to understand their capabilities in depth and, importantly, their limitations.

Real-time Streaming in Power BI

Real-time streaming allows you to stream data and update dashboards in real-time. Any visual or dashboard that makes use of a real-time streaming dataset in Power BI can display and update real-time data.

Types of Real-time datasets

There are three types of Power BI real-time streaming datasets designed for displaying real-time data:

  1. Push datasets
  2. Streaming datasets
  3. PubNub streaming datasets

Only the Push dataset allows historical data to be stored. If we want to build real-time analytical reports that show historic data as well as the latest changes, we need to use a Push dataset. The other two, Streaming datasets and PubNub streaming datasets, are used when we want dashboard tiles that showcase only the latest data points.

Here’s a table listing the major differences between all three datasets. You can find more information here.

| Capability | Push | Streaming | PubNub |
| --- | --- | --- | --- |
| Update dashboard tiles in real time | Yes, with visuals created in reports and then pinned to the dashboard | Yes, with custom streaming tiles added directly to the dashboard | Yes, with custom streaming tiles added directly to the dashboard |
| Data stored permanently in Power BI for historic analysis | Yes | No, stored temporarily for one hour only | No |
| Ability to build reports atop the data | Yes | No | No |

Push Dataset

This is a special case of a streaming dataset. When creating a streaming dataset in Power BI, enabling the ‘Historic data analysis’ option results in a Push dataset. Once this dataset is created, the Power BI service automatically creates a database to store the data.

Reports can be built on these datasets like any other dataset. Power BI does not allow transformations to be performed on a Push dataset, and it cannot be combined with other data sources either; however, measures can be added to the existing table. Data can also be deleted using a REST API call.
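
As an illustration of how rows reach a Push dataset, here is a minimal Python sketch using the dataset’s Push URL; the URL shown is a placeholder for the one Power BI generates when you create the dataset with the API option:

```python
# Sketch: pushing rows to a Power BI push dataset over its REST Push URL.
# The URL is a placeholder; Power BI generates the real one per dataset.
from datetime import datetime, timezone
import requests

PUSH_URL = ("https://api.powerbi.com/beta/<tenant-id>/datasets/"
            "<dataset-id>/rows?key=<key>")

rows = [{
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "sensor": "line-3",
    "temperature": 71.4,
}]

response = requests.post(PUSH_URL, json=rows)
response.raise_for_status()  # 200 OK means the rows were accepted
```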

[Screenshot: New Streaming Dataset]

Streaming Dataset

Data gets pushed here as well, but there is an important difference: Power BI stores the data only in a temporary cache, which quickly expires. The temporary cache can only be used to display visuals that have some transient sense of history, such as a line chart with a time window of one hour.

Since no underlying database is created, you cannot build reports from this data, nor can you use report functionality such as filtering or custom visuals.

The only way to visualise this data is by creating a dashboard and adding a tile with “Custom Streaming Data” under the Real-Time Data section.

[Screenshot: Real-time data tile]

PubNub Dataset

With this dataset, the Power BI web client uses the PubNub SDK to read an existing PubNub data stream. No data is stored by the Power BI service. PubNub is a third-party data service.

As with a streaming dataset, there is no underlying database in Power BI, so you cannot build report visuals against the data that flows in, and cannot take advantage of the other report functionalities. It can only be visualised by adding a tile to the dashboard, and configuring a PubNub data stream as the source.

Any web or mobile application that uses the PubNub platform for real-time data streaming could act as the source; the sketch below shows the publishing side.
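
For example, here is a minimal sketch of publishing messages to a PubNub channel with the pubnub Python SDK; the keys and channel name are placeholders, and a Power BI tile configured against the same channel would render the values as they arrive:

```python
# Sketch: publishing real-time values to a PubNub channel that a Power BI
# dashboard tile could consume. Keys and channel name are placeholders.
from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub

config = PNConfiguration()
config.publish_key = "<publish-key>"
config.subscribe_key = "<subscribe-key>"
config.uuid = "telemetry-gateway-1"

pubnub = PubNub(config)

# Publish one message synchronously and report the server timetoken.
envelope = (pubnub.publish()
            .channel("factory-telemetry")
            .message({"sensor": "line-3", "temperature": 71.4})
            .sync())
print("Published at timetoken:", envelope.result.timetoken)
```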

[Screenshot: Streaming dataset]

In general, when using a custom streaming dashboard tile, you can choose from five different visualisation types, as shown in the screenshot below. When added to a dashboard, these tiles display a lightning-bolt icon in the top-left corner, indicating that they are real-time data tiles.

[Screenshot: custom streaming tile visualisation types]

How to Choose a Dataset Type?

A Push dataset can be used when historic data analysis and building reports atop the dataset are crucial. The dataset can be created using the API option in the streaming-dataset UI, and it can be connected to using either the Power BI service or Power BI Desktop.

A Push dataset can also be created using the ‘Azure Stream’ option. However, a dataset created this way can only store a maximum of 200,000 rows; after hitting the limit, rows are dropped in a FIFO (first-in, first-out) fashion.

If the idea is to have dashboard tiles displaying pre-aggregated live data using simple visuals, then a streaming dataset is the perfect choice. This can only be connected to using the Power BI service.

PubNub datasets are used when the data is generated using the PubNub data stream. Tiles created using this dataset are optimized for displaying real-time data with very little latency.

I hope that helped in answering some of the questions on real-time analytics using Power BI. Please feel free to contact me through contact@oxianalytics.xyz if you have any further questions I can help with.

Digital Strategy and Analytics in Pharmaceuticals

One reason for the success of big Pharma is superior prowess in sales and marketing. Once R&D (Research & Development) has delivered a new patented innovation with global market potential, the marketing and sales activities promote rapid global adoption and shorten the time to peak sales within the patent life.

Big Pharma has long dominated the available sales and marketing channels. This position of strategic advantage has been stable, kept in place by a combination of conservative regulation and big Pharma’s satisfaction with the status quo.

Current trends are threatening this stability. So, how might things change? In this blog I will explore how this instability could disrupt the status quo and create a new basis for competition. All players would be wise to take account of the instability and the new possibilities that could open up.

For example, Covid-19 has propelled more innovation in digital practices across healthcare in the last three months than in the previous three years. At the same time, Covid-19 has disrupted Pharma’s promotion model. Polling of healthcare professionals suggests that there is every prospect that these practices, and the disruption, will persist. One possibility that opens up is that digital strategies could become increasingly important.

Healthcare professionals say they need help with digital innovation and new digital practices post Covid. The need for digital re-skilling and learning across the healthcare sector is significant. The healthcare professions need help with funding, digital skills, and digital tools.

Pharma has ample professionals with empathy for the caring professions who are skilled in communicating new concepts and practices as well as in building relationships. Pharma could respond to these new needs and new opportunities by re-deploying its resources to help everyone transition to the new faster-moving, real-time digital world. Pharma resources could switch to communicating with and helping prescribing health professionals via digital conversations on digital channels.

Regulators would need to play their part. They have long constrained the ways that Pharma firms communicate and build relationships with healthcare professionals. However, regulators are interested in promoting consumer and patient welfare, healthcare innovation, and competition, all of which could be improved by a move to digital channels. Regulators are always open to discussion and negotiation, and they have looked, and will look, favourably on new digital practices that help patients, healthcare practitioners and pharmaceutical companies, provided those practices are open and competitive.

Regulations differ by legal jurisdiction. For this reason, firms have long delegated responsibility for local relationships from the global level to the local market level. Local is where the digital conversations are handled; local and regional is where discussions with regulators are conducted. If they choose to take the initiative, big Pharma could orchestrate those local and regional regulatory discussions now.

New digital channels will be harder for big Pharma to dominate by any means other than superior marketing and relationship-building competence. Digital channels will allow smaller players with a compelling story for a niche audience to cut through and get their message across. They will obtain insights that they can use both to shape their messaging and to target unmet needs. While individually these incursions into the larger firms’ market will be small, collectively they will add up.

Local market teams in big Pharma will be taking on their smaller local market competitors head to head on a newly-level playing field. Local marketing teams will engage their local medical professionals digitally on topics that are locally relevant. They will use local digital marketing applications that support digital conversations, information sharing and relationship building.

This scenario would be a competitive marketplace, but there is one sustainable advantage that big Pharma could create by using data and analytics: a global database that could be a rich source of insights into healthcare professionals’ evolving priorities. Mined for new insights, this could become a long-term competitive advantage in customer understanding and segmentation.

To create this intelligence advantage, big Pharma firms could merge the local data obtained from local conversations captured by digital analytics applications with data from providers, payors, patient advocacy groups, web-based medical sites, third-party data brokers, channel partners and open data from government and inter-governmental sources. Once collected, structured, and mined for insights, it could restore to big Pharma a source of strategic advantage from global scale and reach in marketing and sales. Technically these databases are not difficult to create, using scalable cloud platforms such as Microsoft Azure and other cloud vendors in combination with enterprise-grade visual data analytics tools.

Digital Strategy as a Core Competence

Digital could become an important competence for all firms that wish to engage with healthcare professionals. Firms will keep this competency high only with continuous improvement and continuous learning, ideally based on digital analytics; this is how they will adapt to the new era of Covid-resilient business practices. The shift could open big Pharma up locally to competition from digital-first niche competitors. While smaller businesses will not have the resources of the large pharmaceutical companies, digital is perfectly suited to niche businesses, and local data and analysis are equally available to all players. Firms of whatever size can craft a digital strategy that is right for them, their healthcare system, and their patients.

Conclusion

It is likely that the recent disruptive changes mean that all players in the pharmaceutical and healthcare sector will need a new strategy for digital. Large firms have both the advantages and the challenges of scale, but small players can easily access the affordable services and the data and analytics they need to compete in niche markets. Digital strategies, widely adopted, could supply the impetus for innovation and productivity across the healthcare sector.