Supply Chain Resilience and How Visualisation Tools Can Be Beneficial

How supply chain resilience is used in businesses

The resilience of supply chains has been tested in recent years by some catastrophic events and by political disruption to trade flows. Supply chain managers have developed some pragmatic approaches to risk management and some useful tools, including the ability to visualise supply chain dynamics.

These tools visualise the flow of material, components, and products from source to destination via nodes showing locations, facilities, and distribution points. The visualisations include data such as inventory levels of components and spare parts at supply chain storage locations and depots.

How visualisation can play a crucial part in supply chain resilience

The visualisations can be filtered by the attributes of the flow, such as product categories, products, brands, component subsets, sub-assemblies, and SKUs. They allow the flows through specific locations and facilities to be seen and understood over a chosen time frame, and because they are refreshed automatically from digital sources at the supply chain locations, they are always up to date.

Supply chain managers use these visual tools to make sure the supply chain is resilient to a range of commonplace but lower-impact disruptions, for example by increasing inventory levels to reflect the risk level. For rarer but higher-impact events, the visualisations inform the rapid recovery actions that need to be taken. Issues such as climate change appear to be increasing the rate at which high-impact events occur, so tools that support rapid recovery from disruption and losses have high utility.

The analytics tools used include Adobe Analytics, Microsoft Power BI, Tableau and Qlik. The visualisations are not out of the box, but purpose-built to provide the filtering and drill-down functionality required to understand potentially complex patterns of product and component flows. The calculations are also purpose-built, covering inventory cover, supply chain velocity and working capital requirements, all of which reflect the potential impact of disruptions on cash flows and cash requirements.
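
As a simple illustration of the kind of purpose-built calculation that sits behind such dashboards, the sketch below computes inventory cover and the working capital tied up in stock, and filters by product category. The column names and figures are hypothetical placeholders rather than anyone's real data; in the tools above, the same logic would live in the data model rather than in a script.

```python
import pandas as pd

# Hypothetical supply chain snapshot: SKUs held at depots.
flows = pd.DataFrame({
    "sku": ["A-100", "A-200", "B-300"],
    "category": ["Brakes", "Brakes", "Filters"],
    "location": ["Depot-1", "Depot-1", "Depot-2"],
    "units_on_hand": [1200, 300, 800],
    "daily_demand": [40, 25, 10],
    "unit_cost": [15.0, 22.5, 4.0],
})

# Inventory cover: how many days of demand the current stock can satisfy.
flows["cover_days"] = flows["units_on_hand"] / flows["daily_demand"]

# Working capital tied up in stock at each location.
flows["stock_value"] = flows["units_on_hand"] * flows["unit_cost"]

# The same drill-down the dashboards expose: filter to one product category.
brakes = flows[flows["category"] == "Brakes"]
print(brakes[["location", "sku", "cover_days", "stock_value"]])
```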

The effectiveness of visualisation tools relies on the quality of the data engineering and data modelling on which the visualisations are based, as well as the power of the filtering and the functionality that permits drill-through of supply chain complexity. Visualisation tools provide a means of engaging people and processes in the cost-effective improvement of supply chain resilience.

Contact one of our experts today to discuss how supply chain resilience and visualisation tools can benefit your business.

How to Approach AI Opportunities in Small and Medium Sized Firms

All firms are exploring AI opportunities and working out how to apply AI to their businesses. The widespread belief that AI will be transformational is behind the recommendation that firms make significant investments of time and money in creating the necessary capabilities. Though undoubtedly valid, this advice has large firms in mind – firms with substantial resources and the ability to spread risk. Small and mid-sized firms need approaches that are lower cost, lower risk and that play to their strengths.

AI as a Source of Opportunities

Artificial intelligence is a source of opportunities for businesses, small and large. The data necessary can be assembled from a myriad of sources. Small businesses will need to develop the capabilities to tap into AI effectively and to keep up with change and competition. They need to bring their people up to speed. So, how can they best do that?

AI Courses

There are online-learning platforms, universities, and executive-level programmes to train people. Courses differ, but most are delivered by academics and include theoretical material that is hard to make relevant to a specific business situation.

AI Skills

Firms can hire new people equipped with these skills. However, there are reports of culture clashes between academic newcomers and pragmatic, experienced incumbents in large firms. This suggests this is an even higher risk option for smaller firms where there is more reliance on close working relationships and tacit understanding.

AI and Change

An internal “AI Competence Centre or Academy” that is part of an internal change programme is an option proposed for helping firms “transform.” Whatever the benefits, it is a level of commitment unlikely to be cost-effective for a smaller business.

AI for Smaller Businesses

These firms need an approach that plays to their strengths: practical, focussed on company priorities, engaging, hands-on, intuitive and stripped of academic jargon. It is entirely possible to convey AI principles that are relevant to practical managers. After all, AI is merely replicating what practical managers already do, but in a way that requires reams of data and hours of computer time to perform millions of repetitive calculations. It is only feasible because computer processing power is cheap.

Autonomous cars are some way off; autonomous companies are too far off to worry about. Experience and judgement, however, are indispensable, and small firms have that in spades.

How to Introduce AI

Our recommendation is to introduce AI as just another management tool. Introduce it for a series of selected value-generating purposes. Keep the managers in control and contributing to all aspects of the work, from sourcing and exploring data to interpreting the results. They can then use the results from AI to inform their experienced judgements. Engaging businesspeople and keeping AI in its assistive role is a pragmatic way to apply AI in business.

Managers being sceptical and rejecting some AI results is entirely healthy. The literature is full of embarrassing examples of AI techniques producing results that are not replicable. Firms that use AI to generate benchmark scores for investors or advertisers can produce differing scores for the same benchmark, leading to commercial disputes. Often data issues will cause a degree of uncertainty in the results, and that is one reason why experienced people need to be involved. Making decisions under uncertainty is what they have been doing for years.

Presenting AI Results Visually

It is important to present AI results in a way that everyone can understand. This is the role of data visualisations. These are intuitive. After viewing a series of visualisations of a data set, most people are readily capable of drawing meaning from them and generating hypotheses about the implications and underlying causal relationships.

Conclusion: How to Apply AI

Use AI to support the analysis and decision process. Engage users at every stage, banish jargon, and explain that the algorithms are performing simple tasks, using lots of data and millions of calculations. Use visual analytics to help people engage with the data and explore underlying causal effects. Explain that data and judgement are the key success factors in applying AI.

Positive experience with AI “in its place”, in its assistive role, creates confidence, and experienced people, with their unparalleled feel for their business, can quickly come to see where AI could create further opportunities for their firm – and they will be able to oversee the task.

Using Analytics to Monitor Profitability, Cash & Risk in a Portfolio of Projects

A Portfolio of Project Risks

Organisations that undertake many projects experience variance in the performance of each project and so adopt a portfolio approach to managing risk. Provided the projects' outcomes are largely independent, the variance of the portfolio's average performance will be lower than the variance of any individual project.
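
As a brief illustration of the diversification effect, suppose the margins of the n projects are independent random variables, each with the same variance. The variance of the portfolio's average margin is then:

```latex
\operatorname{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
  = \frac{1}{n^{2}}\sum_{i=1}^{n}\operatorname{Var}(X_i)
  = \frac{\sigma^{2}}{n},
\qquad \text{where } X_i \text{ is the margin of project } i
       \text{ and } \sigma^{2} \text{ is the common variance.}
```

A portfolio of 25 comparable, genuinely independent projects therefore has only one fifth of the standard deviation of a single project, which is the effect the portfolio approach relies on; the independence assumption is examined below.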

The risks of the portfolio must be monitored to ensure that the performance of the portfolio is staying within predicted parameters. Early warnings of issues help management to make early interventions to either bring the portfolio back into balance or to increase the contingency provisions or reserve buffers.

Transfer of Risks to Outsourcing and Contracting Firms

In outsourcing and contracting, clients seek to transfer risk to contractors because contractors have the best skills to manage project risks and the transfer gives them the economic incentive to manage risk well. The contractor spreads the risk over a portfolio of projects for multiple clients and keeps contingency cash reserves that can absorb the potential variance in the performance of the portfolio.

The portfolio approach assumes that the risks of individual projects are not correlated, and monitoring will reveal where this is not the case. For example, a contractor’s portfolio might show a pattern of overly aggressive bidding, resulting in a series of projects with correlated, high levels of risk that require the contractor to maintain significantly higher reserves.
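
A minimal simulation sketch makes the point. The numbers are purely illustrative (25 projects, an expected margin of 0.6 with a standard deviation of 1.5 per project, in arbitrary currency units), and the reserve is defined here, for illustration only, as the cash needed to absorb the worst 1% of simulated portfolio outcomes:

```python
import numpy as np

rng = np.random.default_rng(0)
n_projects, n_sims = 25, 100_000
mean_margin, margin_sd = 0.6, 1.5   # illustrative figures, arbitrary currency units

def reserve_needed(correlation):
    # Covariance matrix with a common pairwise correlation between project margins.
    cov = np.full((n_projects, n_projects), correlation * margin_sd ** 2)
    np.fill_diagonal(cov, margin_sd ** 2)
    margins = rng.multivariate_normal([mean_margin] * n_projects, cov, size=n_sims)
    portfolio = margins.sum(axis=1)            # total margin across the whole portfolio
    # Cash reserve needed to absorb the worst 1% of simulated portfolio outcomes.
    return max(0.0, -np.percentile(portfolio, 1))

print("Reserve needed, independent projects:", round(reserve_needed(0.0), 1))
print("Reserve needed, correlation of 0.6  :", round(reserve_needed(0.6), 1))
```

With the same risk in each individual project, correlated outcomes require a far larger reserve for the same level of protection, which is exactly the pattern that portfolio monitoring is meant to surface early.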

Failure to manage risks correctly and to maintain adequate reserves can lead to losses that exceed the cash reserves, and contractors can go out of business. A failure to monitor the portfolio risk is damaging for the contractor and the client alike.

Example Case: A Construction Company Client

The client was a Canadian construction firm with a portfolio of projects across western Canada. Rapid growth meant that the top management now needed automated reports and analytics to help them to stay on top of the profitability and risk of their portfolio of projects. The growth rate meant increasing revenues but also a growth in the use of cash for working capital and for the risk reserves.

The Data, Data Warehouse and Data Model in SQL Server

We sourced the information for the analysis from accounting, direct labour and operational systems. To bring this information together, we set up automated data feeds from these systems into a data warehouse in the Microsoft Cloud using SQL Server.

We designed a data model that would make all the data, calculations and predictions required for monitoring easily accessible to business users.
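
The sketch below is purely illustrative of the general pattern rather than the client's actual model: automated feeds land in a fact table, dimensions are maintained separately, and portfolio-level questions become simple queries. The connection string, table and column names are hypothetical placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical SQL Server connection string, for illustration only.
engine = create_engine(
    "mssql+pyodbc://user:password@my-server.database.windows.net/dw"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# A feed from the job-costing system, already extracted into a DataFrame.
costs = pd.DataFrame({
    "project_id": [101, 101, 102],
    "period": ["2023-01", "2023-02", "2023-01"],
    "actual_cost": [410_000, 385_000, 150_000],
    "budget_cost": [400_000, 400_000, 160_000],
})

# Load into a fact table; dimensions such as dim_project are maintained separately.
costs.to_sql("fact_project_cost", engine, if_exists="append", index=False)

# The model then makes portfolio questions one query away, e.g. cost variance by project.
variance = pd.read_sql(
    "SELECT project_id, SUM(actual_cost - budget_cost) AS cost_variance "
    "FROM fact_project_cost GROUP BY project_id",
    engine,
)
print(variance)
```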

Dashboards in Tableau

We developed dashboards in Tableau that gave visual insights into the performance of the portfolio. These gave prompt and regular updates on the actual and projected performance of the portfolio, with drill down into project detail.

This meant that the management could quickly find when projects were not performing as expected, rapidly assess the implications, review the causes, and decide on prompt interventions to address the issues.

Management could also manage cash reserves closely, returning to shareholders any cash that was clearly in excess of the working capital and risk reserve requirements, so improving the return on capital employed.

The Broader Implications: Analytics as a Competitive Advantage

This industry sector is one in which the ability of the supplier to manage and absorb risk is of clear value to the customer. Were a contractor to fail to absorb the risk and then fail as a business, the disruption for its customers could be severe. A contracting firm’s skill at managing risk is one of the criteria customers should use to evaluate a bidder.

Demonstrating the ability to monitor and manage risks and reserves could thus be important competitively. The alternative of the supplier keeping large cash reserves to cover any eventuality would require prices which are unattractive to customers or make such a business very unattractive to investors.

Therefore, the ability to monitor and manage risk, with early, targeted action to address and correct problems, is valuable for all outsourcers and contractors and for their clients. Analytics is a low-cost way to make the firm both more competitive and more robust. Firms without these analytics and monitoring capabilities run the risk of being less competitive, less attractive to investors, and more prone to failure.

Features of Azure for Analytics and Data Science

The Azure cloud platform provides powerful yet easy-to-use cloud-based tools for data transformation, data analytics and data science. These tools offer a lot of flexibility to Azure developers and cater to a variety of skill levels. This blog provides a brief run-through of some of the features of the Azure tools for data analysts and data scientists.

For further information, or if you have more questions you’d like to discuss, please feel free to contact us at contact@oxianalytics.xyz and we’d be happy to assist you.

Azure Options for Data Analytics

Azure Synapse Analytics

Azure Synapse Analytics is the next generation of Azure SQL Data Warehouse. It lets users bring together data from many relational and non-relational sources in a single place, whether those sources reside locally or in the Azure cloud. The data is unified, processed, and analysed using SQL, and Azure Synapse Studio acts as a workspace for data analysis and AI tasks.
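
Analysts typically work with Synapse in plain SQL. As a small, hypothetical illustration (the server, database, credentials and table names are placeholders), a dedicated SQL pool can be queried from Python with pyodbc:

```python
import pyodbc

# Placeholder connection details for a Synapse dedicated SQL pool.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-workspace.sql.azuresynapse.net;"
    "DATABASE=salesdw;UID=analyst;PWD=<your-password>"
)

# A unified view over data loaded from several source systems (hypothetical table).
cursor = conn.cursor()
cursor.execute(
    "SELECT region, SUM(revenue) AS total_revenue "
    "FROM dbo.fact_sales GROUP BY region ORDER BY total_revenue DESC"
)
for region, total_revenue in cursor.fetchall():
    print(region, total_revenue)
conn.close()
```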

Azure Databricks

Azure Databricks is an analytics service based on Apache Spark that can process large datasets quickly. Databricks supports several languages, such as Java, Python, Scala and SQL, as well as libraries such as PyTorch and TensorFlow, and Spark data can be worked with from any of these languages and frameworks.

Databricks also offers integration with Azure Machine Learning, giving an Azure developer access to hundreds of pre-built machine learning algorithms. It minimises the complexity of running Spark, compared with setting up a cluster in your own data centre, through features such as auto-termination and auto-scaling.
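
A minimal PySpark sketch of the kind of workload Databricks runs well is shown below; the storage path and column names are placeholders. (On Databricks a SparkSession named spark is already provided in notebooks; the getOrCreate call simply keeps the sketch self-contained.)

```python
from pyspark.sql import SparkSession, functions as F

# Reuse the existing session on Databricks, or create one elsewhere.
spark = SparkSession.builder.getOrCreate()

# Read a large set of order records from cloud storage (placeholder path).
orders = spark.read.option("header", "true").csv(
    "abfss://raw@mystorageaccount.dfs.core.windows.net/orders/*.csv"
)

# Distributed aggregation: daily revenue per product category.
daily_revenue = (
    orders
    .withColumn(
        "revenue",
        F.col("quantity").cast("double") * F.col("unit_price").cast("double"),
    )
    .groupBy("order_date", "category")
    .agg(F.sum("revenue").alias("revenue"))
)

daily_revenue.show(10)
```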

Azure Data Factory

Azure Data Factory is an ETL (Extract, Transform, Load) service, used for processing structured data at scale. An ETL process extracts the data from various data sources, cleans and transforms the data and converts it into a format that is suitable for analysis. Data Factory helps users build ETL flows using a visual editor without code.

There are over 90 built-in connectors to common data sources, including BigQuery, S3 and many others. Data can also be copied easily to Azure File Storage using Data Factory.

Azure Stream Analytics

Azure Stream Analytics is a real-time analytics service. It provides an end-to-end pipeline for stream processing, built on serverless technology: a streaming pipeline is defined for the incoming data and the processing steps are expressed in SQL syntax. The processing can scale dynamically depending on the throughput and volume of the data, and the service also offers built-in recovery and machine learning capabilities.

Azure Stream Analytics lets you add Power BI as an output, which allows Azure developers to visualise those data streams in real time in the Power BI Service.

Data Lake Analytics

Data Lake Analytics is an on-demand analytics job service that simplifies big data processing. It dynamically provisions resources, and the system can automatically scale up or down as required. Using Azure Data Lake Analytics, you can perform data transformations in a variety of languages, such as Python, C#.NET and SQL, among others. Data Lake Analytics connects to other Azure data sources, such as Azure Data Lake Storage, and performs data analytics on the fly. A further advantage is that it is a cost-effective way to run big data workloads.

Azure Analysis Services

Azure Analysis Services can fetch data from multiple sources and build a single semantic model for processing. This model can help you develop enterprise-grade business intelligence solutions with security and reduced delivery time. Analysis Services is highly scalable, and it is possible to import existing tabular models or SQL tables into the system.

Azure Tools for Data Science

Azure Machine Learning Studio

Azure Machine Learning Studio is a cloud-based, collaborative, drag-and-drop platform where users of varying skill levels can build, deploy and test machine learning solutions, either using a no-code designer or built-in Jupyter notebooks for a code-first experience. It provides automated machine learning, which helps both professional and non-professional data scientists to build ML models rapidly. Pre-configured machine learning algorithms and management modules for Azure data warehouses are also available, making Azure ML Studio ideal for any data scientist looking to efficiently research a machine learning model’s performance.

Azure ML Studio offers built-in integrations with other Azure services, such as Databricks and Data Lake Storage. It also supports open-source frameworks and languages such as MLflow, Kubeflow, PyTorch, TensorFlow, Python and R.
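
As a small illustration of the code-first route, the MLflow tracking API mentioned above can be used to log an experiment so that runs can be compared in the studio UI. This is a generic MLflow sketch using a public scikit-learn sample dataset, not a full Azure ML workflow; in Azure ML the tracking URI would point at the workspace.

```python
import mlflow
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Train a simple model on a public sample dataset.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))

    # Log parameters and metrics so this run can be compared with others.
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("mae", mae)
```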

Azure Cognitive Services

Azure Cognitive Services offers a selection of pre-built ML and AI models. It can be used by any developer and doesn’t require machine-learning expertise. Some of the features of Cognitive Services are:

  • Image processing algorithms that can identify, index and caption your images.
  • Speech recognition algorithms that can convert audio into readable and searchable text, integrate real-time speech translation into your apps, and allow voice verification based on audio.
  • Mapping of complex data for semantic search and smart recommendations.
  • Natural-language processing with pre-built commands to understand the user’s needs, plus detection and translation of more than 60 supported languages.
  • Access to billions of web pages and images through a single call by adding Search APIs to your apps, enabling safe, ad-free search and advanced features like video search in your app.
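
As a small sketch of how these services are typically consumed from code, the snippet below uses the azure-ai-textanalytics client, one of the language-related Cognitive Services SDKs, to detect the language of a piece of text. The endpoint and key are placeholders for your own resource; other Cognitive Services follow the same key-plus-endpoint pattern.

```python
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

# Placeholder endpoint and key for a Cognitive Services language resource.
client = TextAnalyticsClient(
    endpoint="https://my-language-resource.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

documents = ["Ce document est rédigé en français."]
results = client.detect_language(documents=documents)

for doc in results:
    if not doc.is_error:
        # Print the detected language and the model's confidence.
        print(doc.primary_language.name, doc.primary_language.confidence_score)
```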

Data Science Virtual Machine

Azure Data Science Virtual Machine (DSVM) is a virtual machine with pre-configured data science tools, so ML solutions can be developed in a ready-made environment. The DSVM can be the ideal environment for data scientists to learn and compare machine learning tools, as it is a complete development setting for ML on the Azure platform. The tools included span data platforms, ML and AI tools, data visualisation tools and development tools.

The DSVM environment can significantly reduce the time needed to install, troubleshoot and manage data science frameworks. You can use it to evaluate or learn new data science tools; the full list of tools included with the DSVM is available in the Microsoft documentation.

Sustainability Analytics – How to Define Sustainability Metrics

Introduction

There are five key communities with an interest in sustainability: consumers, firms, investors, institutions, and governments. Consumer attitudes are hardening, firms and investors are scrambling to get measures in place, intergovernmental bodies are taking initiatives, and politicians are trying to reach a consensus. The efforts of these communities are commendable, but the analytics for sustainability are in danger of letting them down. Few of the metrics stand up to scrutiny. Firms need to critically review the metrics that are put forward to assess their performance, and they need to be ready to influence and lobby to get metrics that are valid. Sustainability issues will inevitably affect firms – their funding, tariffs, taxes, brands, and access to markets.

Sustainability Targets

It looks like we will have a climate disaster on our hands. The world’s sustainability and global warming targets will be missed. We will see increases in the frequency of catastrophic events. For some people in some locations, it is already too late. With the disaster will come unpredictable political fallout. The groundswell of popular support for the sustainability agenda suggests people’s priorities have shifted.

Sustainability Metrics

Analytics and metrics are playing a role. Governments use them to direct incentives such as funding, tax breaks and other forms of state support and subsidy for innovation in sustainability. Investors and lenders use them to move money out of industries, firms, products and brands that may be seen to be part of the problem, so increasing their costs of capital and reducing their returns.

Sustainability Data

There is more and more data available to investors, lenders, and large and small firms alike. Governments and research institutions provide relevant data and metrics as open data, freely available. One aim is to encourage firms to compete to provide innovative products and services that will be part of the sustainability revolution.

Sustainability Analytics and Business Models

The European Union is pursuing initiatives aimed at accelerating this evolution of metrics in order to support regulatory interventions. The European Central Bank will use them to direct its market interventions to support the European Green agenda. Trade negotiators will use tariffs to target imports from firms and countries that are seen not to be supporting the sustainability agenda. Firms may find that regulatory changes will undermine their business models with higher taxes and tariffs, and even stop them accessing certain markets.

Some firms are actively trying to transform their energy mix and their business models. They hold a conviction that a business model based on sustainable growth is a guarantee of quality and performance that contributes to the resilience of their brands. Certain consumer segments, while perhaps not willing to pay premiums for brands that support the sustainability agenda, are beginning to shun brands and products that do not.

Sustainability Analytics and Dashboards

Not to include sustainability analytics within your wider analytics agenda would seem perverse. That said, the metrics are still a work in progress. Providers of ESG (Environmental, Social and Governance) ratings to investors cannot get their ratings to align, and regulators have not yet standardised metrics across jurisdictions.

All firms need to invest some time in developing their use of metrics on sustainability issues. It will help firms identify investments and assets that risk being “stranded” by regulatory changes. It will help firms attract funding and subsidies that will accelerate their growth. It will help firms communicate their sustainability goals and progress to their own employees and customers. Firms would do well to include sustainability metrics in their performance dashboards – demonstrating a commitment to the “sustainability revolution” to all stakeholders.

Five Sustainability Issues and Innovation in Sustainability Metrics

There is still room for innovation in metrics, but the five sustainability activities to be addressed by firms are very clear:

  • climate change mitigation and adaptation
  • sustainable use and protection of water and marine resources
  • transition to a circular economy, including waste prevention and increasing the uptake of secondary raw materials
  • pollution prevention and control 
  • protection and restoration of biodiversity and ecosystems.

Conclusion

Assessing and measuring a firm’s contribution to addressing all these dimensions of sustainability will become more urgent as the impacts of the current missed climate targets become clearer to everyone, not least consumers.