Building applications for the cloud

In traditional application development, the focus is on preventing the system from failing; in technical terms: increasing the Mean Time Between Failures (MTBF). Applications built for the cloud require a different mindset. One must expect failures to happen every now and then and focus on how to recover from them. The focus should shift to the Mean Time To Recovery (MTTR).

The background

Distributed systems are complex by nature. They connect a broad set of components which all interact to form one solution. A failure in one component can trigger a cascade of issues. Some of these components might even be external services (e.g. payment services) that will most likely not be under your control and that might be temporarily unavailable.

Mitigation strategies

Some resilience features are built into the cloud services you will be using; other precautions need to be added yourself. Two of those are crucial: monitoring and automation are key capabilities for a cloud workload. By monitoring a workload, you can trigger notification and tracking of failures, and automation should allow recovery processes to work around or repair the failure without human intervention.
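To make that recovery mindset concrete, below is a minimal sketch of one common automated-recovery pattern: retrying a call to a temporarily unavailable external service with exponential backoff. It is written in Python purely for illustration; the helper and the payment service call are assumptions, not an existing API, and most cloud SDKs and resilience libraries offer this pattern out of the box.

    import random
    import time

    # Minimal sketch of "expect failure, recover automatically": retry a call
    # to a temporarily unavailable external service, waiting longer after each
    # failed attempt. All names and limits here are illustrative assumptions.
    def call_with_retries(operation, max_attempts=5, base_delay=0.5):
        for attempt in range(1, max_attempts + 1):
            try:
                return operation()
            except ConnectionError:
                if attempt == max_attempts:
                    raise  # recovery failed: surface the error to monitoring
                # Exponential backoff with jitter to avoid thundering herds.
                delay = base_delay * (2 ** (attempt - 1)) * random.uniform(0.8, 1.2)
                time.sleep(delay)

    # Usage (hypothetical): wrap the call to the external payment service.
    # result = call_with_retries(lambda: payment_service.charge(order))

The point is not this specific code, but that recovery from transient failures is designed in from the start instead of being treated as an exception.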

Monitoring of a cloud environment is not something you want to set up per workload. Of course, every workload should be monitored, but you need a consistent monitoring setup across workloads. It is not just monitoring of “an application”; it is monitoring of the whole environment. You need a foundation that covers your specific monitoring requirements.

Resilience as a function of cost

The technical design considerations are only one part of the story. Probably more important is the conversation with the business about availability requirements. Users today expect 24/7 availability, and there is an implicit expectation that by moving workloads to the cloud, they will get 24/7 availability out of the box. It is crucial to have that conversation during the initial phase, before starting to transition a new workload. How much downtime is acceptable? How much would this downtime cost the business? How much is the business willing to invest to make their application highly available?

Summary

Designing and building a cloud application offers a significant set of advantages but requires different strategies compared to building on-premises applications. There is a mindset shift in going from Mean Time Between Failures to Mean Time To Recovery, and this shift has both technical and business implications that need to be properly addressed.

Proof of Concepts in a Cloud world

We often get questions like this one (don’t worry if the question sounds like Greek for now):

“I want to set up a SharePoint farm in Azure. Microsoft says using a SQL managed instance is supported instead of using a SQL Server. Would you recommend this?”

Let’s translate this to:

“I want to use a specific cloud service to support my cloud application. Would you recommend this?”

To answer this question, let’s go back in time… Over 20 years ago, one of my first large projects was to build a solution for a logistics company that involved a very complex planning module. It turned out to be a custom development that would run in “client-server” mode. The architecture in those days was relatively straightforward: a database server box, Windows clients and some networking to stitch it all together. Hardware had to be ordered on time. Imagine somebody had come up with the idea: why don’t we order a UNIX server with an Oracle 8i database, a Windows server with an Oracle 8i database and a Windows server with SQL Server 7, and see what works best… The answer would have been along the lines of “have you lost your mind?”…

Oracle 8 was released in June 1997. The next version, Oracle 9i, only became available in 2001. In a cloud world, service updates are delivered as a never-ending stream of small incremental updates. And it is not just the core services that are updated in this way; software development is done in exactly the same way. Amazon engineers deploy code every 11.7 seconds on average, and Netflix engineers deploy code thousands of times per day.

So the reality is you can no longer turn to a software provider to ask: “Will this or that scenario work for me?” The good news is that trying it out is fast and cheap. It does require a different mindset, a different organisation and a different approach.

Different mindset

Traditional on-premises budgeting is often done by “counting boxes”: how many do we have, how many need to be replaced, … Counting boxes doesn’t work in a cloud world. In a cloud world you are using services; it is no coincidence that some services are called “serverless”. A cloud solution is a combination of services, and one solution can be designed with an endless combination of different types of services. Architecture is a very different game from 20 years ago. Architects will select services based on business requirements, and they will have to test them by means of … a proof of concept. Cloud architects will get their hands “dirty” to try things out. No hardware needs to be ordered or installed; they take a separate cloud instance and set up what they need to try out.

Different Organisation

A traditional IT organisation split into development, infrastructure and operations is not a good idea in a cloud environment. That in itself is another story. Looking at it from a proof of concept point of view, the small proof of concept teams must be able to get the POC up and running without organisational boundaries. Often setting one up is even a one-person effort.

Different Approach

In a larger enterprise environment, the last thing you want is 100 or more people spinning up their own proofs of concept without any structure. That is not the way to go. Part of a Cloud Center of Excellence setup is a workstream called “Reference Architectures”. It’s not a typo to have “architectureS”: in an enterprise you will have more than one set of reference architectures. To keep it simple: a security reference architecture, a database reference architecture, virtualisation reference architectures, …

It is within these workstreams that technical proofs of concept are executed. But the idea of proofs of concept is not limited to technical explorations. To take advantage of cloud benefits, business proofs of concept are even more important. Using technical building blocks, business ideas will be tried out. Without major upfront investments a solution can be set up, tried out and, if needed, shut down again. Or, when proven successful, expanded into a next iteration.

Using proofs of concept in a cloud world has nothing to do with random trial and error. The technology allows you to test things out, both technical and business-related, without major upfront investments.

What is a Cloud Center of Excellence ?

According to Gartner, every enterprise needs one. According to conversations with our customers, every enterprise wants one. But what exactly is a Cloud Center of Excellence (CCoE)? In 2016, Stephen Orban, at that time Global Head of Enterprise Strategy at AWS, started using the term CCoE in a blog series around the “Enterprise Cloud Journey”. His vision for a CCoE at that time was similar to what today would be a vision for a DevOps team, with a little more focus on change management. That original vision still causes confusion sometimes: “is a CCoE not just another name for DevOps?”. The short answer is that the original vision is still relevant: you should have a team responsible for developing a framework for cloud operations, the governance around it, and for building out best practices throughout the business. However, just like the cloud has evolved a lot since 2016, the definition of a CCoE has moved on as well. DevOps is still one of the capabilities a CCoE will focus on, but there is much more. Already in 2017, AWS had expanded the original definition:

“A Cloud Center of Excellence is a cross-functional team of people responsible for developing and managing the cloud strategy, governance, and best practices that the rest of the organization can leverage to transform the business using the cloud.” 

AWS, Cloud Management Report 2017

How we see it at 45 Degrees, a partner of Nimbuz:

A Cloud Center of Excellence is a centralised program with the goal of optimising the cloud-enabled IT transformation, with a focus on 6 key capabilities: Strategy, People, Governance, Platform, Security and Operations.

Enough definitions; everybody will have their own. Let us go into what it really is.

It is a program.

When you bring people together in a cross-functional way, it is a good idea to put a program structure behind it. Hence, we do not start from “it is a group of people”, but rather from “it is a program”. The program needs a program charter, which defines what the program is all about. Building this program charter is best done in an iterative way; just like adopting cloud technology, a big bang approach will not work. Start simple with version 1 of the program charter, and work in an agile way even with your program definition. That does not mean it changes every day, but that you work in defined iterations and expand the scope as the maturity grows. Calling it a program and running it as a program will also establish trust with senior leadership and help to get buy-in.

There are people in the program.

Of course, the program is staffed by people. But how many, and in what roles? And where do they come from?

First things first: outsourcing a CCoE is not the way to go. The CCoE is a program you must own and run yourself, although you can add external experts to the staffing. To start off, you might even need them to bring in the new skills required to start a CCoE. Our pragmatic recommendation is to staff iteration one of your CCoE with at least two of your own people: a CCoE Lead and a CCoE Architect. The CCoE Lead takes control of the CCoE program and of the non-technical capabilities; the CCoE Architect takes control of the technical capabilities. By keeping the core team small, they must bring other people into their workstreams. That is critical for the success of a CCoE. As the maturity grows you can add more people to the CCoE, always keeping in mind to stay lean and mean.

A question we often get is: “But I do not have people with these skills in-house?” Which obviously makes sense. We see two ways of dealing with that scenario. Either you appoint two people with a keen interest and the potential to learn to operate in a cloud world, and you put external consultants next to them to work as a tandem. Or you hire them externally. The second scenario is not without risks. Finding this type of people is one challenge, but onboarding them is more difficult. One of the key skillsets of people in the CCoE is that they can bring people from different skillsets and organisations together to work on a common goal, and for new people it takes time to understand the organisation. So the second scenario mainly poses a timing challenge.

It is centralised.

The CCoE is a centralised function at the crossroads of central IT, business IT and cloud service consumers in the business. A successful CCoE serves as a broker or a business partner. In the traditional view of IT, the IT division is often an operations unit or an abstraction layer between the business and IT assets. With a well-running CCoE the focus is on delegated responsibility; we like to call it “freedom within the frame”. Microsoft has a nice analogy image showing the difference between the two approaches.

A misunderstanding we often hear is that customers assume a CCoE will “run all of the cloud”. Whatever that means exactly, the CCoE is not there:

  • To run all the cloud (migration) projects within the CCoE.  These projects will be done in project teams, which will use various best practices established by the CCoE workstreams.
  • To run all the cloud operations.  The CCoE will define best practices for topics related to operations.  But the CCoE is not the team that will do the operations.

People in these project/operations teams will participate in workstreams owned by the CCoE.  That is critical for the success of the CCoE and as such of the cloud transformation.

It is to optimise.

Cloud computing is marketed as “easy to adopt”. Start with a proof of concept or a pilot; the investment cost is very low. Fail fast, learn fast. It reportedly took Thomas Edison over 9,000 attempts to invent the light bulb. His advantage was that he was not stressed by time… The reality is that cloud adoption is complex, and the adoption time is not unlimited, quite the opposite. A Cloud Center of Excellence is there to optimise the cloud-enabled transformation. Optimisation has many faces; an interesting example is “cost optimisation”. Recently we have received a lot of questions around “the cost of the cloud”. Let us be a bit provocative:

  • If you do a simple lift and shift of your on-premises datacenter to the cloud, it will most likely be more expensive.
  • If you just put a proof-of-concept workload in production as is, consumption costs might be very high.
  • If every development team can pick and choose what cloud services they will use, your cloud services will trigger a big bill each month.

The topic of optimising cost in a cloud world is one of the many topics where a CCoE plays a key role. The list of topics is long, which is why you need a framework that gives you a checklist of the capabilities required to optimise your cloud transformation.

It has a broad scope: the 6 main capabilities.

And that brings us to these capabilities. At 45 Degrees we have structured our CCoE framework around 6 capabilities. Within each of these 6 capabilities there is a defined set of sub-capabilities, which together give a good overview of what it takes to run an optimised cloud transformation. Looking at the scope of such a framework, the need to work in an iterative way becomes clear. That triggers another interesting question, which is covered in another blog: “How do you start building a CCoE?”.

How to add Nimbuz as your Digital Partner of Record

The Partner of Record for your Microsoft cloud subscription(s) is the partner who is helping you design, build, deploy or manage a solution that you’ve built on the service.

Attaching Nimbuz as your DPOR gives our team access to your usage and consumption information, but obviously not to your company’s applications and data. There is absolutely no cost to selecting a DPOR. The usage and consumption information enables us to provide a better and more proactive service, help you optimize your usage and licensing, and drive better adoption of your Microsoft investment.

Step-by-step instructions to add a Digital Partner of Record to your Office 365, CRM Online, Intune, and Enterprise Mobility Suite subscriptions

 

  1. Go to the Office Customer Portal at https://portal.office.com/
  2. In the Microsoft 365 admin center, go to the Billing > Your products page.
  3. On the Products tab, select the subscription that you want to edit.
  4. On the subscription details page, under Partner information, type 5253004. That is the Nimbuz NV partner ID.
  5. After you assign Nimbuz as your Partner of Record, we will receive an email notification that lets us know that we have been assigned as Digital Partner of Record.

 

Step-by-step instructions to add a Digital Partner of Record to your Azure subscription.

 

  1. Go to the Microsoft Azure portal at http://azure.microsoft.com/
  2. Click on the My Account icon on the upper middle of the screen.
  3. Click on Usage and Billing.
  4. Log into your account using your user name and password.
  5. Once you’ve signed in, click on Subscriptions to manage your subscriptions. Select your subscription.
  6. On the Summary Subscription Page, click on Partner Information in the right navigation. This is where you will attach your Partner of Record.
  7. Enter partner ID 5253004 to designate Nimbuz as your POR.
  8. Click Check ID to see the name of the partner. Verify you have selected Nimbuz, and click the check box to complete assigning your Partner of Record.

After you assign a Partner of Record, Nimbuz will receive an email notification that lets us know that you have assigned, changed or removed your Partner of Record.

Ready for the future with Big Data and Artificial Intelligence

For companies and organizations today, everything revolves around data – especially how they process that data. New buzzwords and software technologies emerge in this field every year. The trick is to find the right balance between impulsively embracing each new trend and waiting just the right amount of time before springing into action. If you jump aboard too early, you might be taking too much of a risk – and you’ll most likely be reacting to hype. Those who wait too long risk missing the boat and being overtaken by the competition. Blindly responding to every new trend and technology is, therefore, not the right way to go.

It’s much more important to develop the right mindset: the knowledge that every single company and organization, both now and even more so in the future, is also a digital company.

Only with that vision will you succeed in bringing people, data and processes together. Because it’s not about software technology per se, but rather the doors that this technology can open. If you have the right data mindset, business processes will run more efficiently, you will gain insights into market developments, strengthen your HR policy and better understand consumer behavior. Ultimately, this increases a company’s customer satisfaction level and competitiveness. With the right digital vision, you will also be able to evaluate trends, experiment quickly and create added value for your organization, employees, customers and partners in all kinds of ways. And that is crucial, because the world will continue to change dramatically in the coming years, and more and more data will be generated.

In this blog, we zoom in on big data:

  • What opportunities can big data open up for your organization?
  • How can data be optimally processed and analyzed – and even turned into trends and predictions?
  • What’s the best way to approach this? And what mindset and expertise are required?

The answers to all these and many more questions will help your organization navigate its way through a technological shift, tailored to your corporate culture. A switch to more data-driven decisions, processes and entrepreneurship. In short: a shift towards an optimal digital transformation.

What can companies learn from big data?

Online clothing and shoe retailer Zalando, for example, knows that its web shop is visited 2.5 billion times a year. That 2.76 per cent of those visits lead to purchases. That an average of 64.5 euros is spent on each purchase. And that a customer makes about 3.9 purchases per year. With this strategic data, Zalando can find out who their “ideal” customer is, which products they should offer to that customer, and at which point in time they can do that. This is just one of the many business examples where it is beyond doubt that big data will become increasingly important for companies in the coming years.
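Numbers like these become even more tangible with a quick back-of-the-envelope calculation. The sketch below simply multiplies the figures quoted above; the derived totals are our own arithmetic for illustration, not reported Zalando results.

    # Back-of-the-envelope arithmetic on the figures quoted above.
    # The derived totals are illustrative, not reported Zalando results.
    visits_per_year = 2_500_000_000
    conversion_rate = 0.0276
    avg_basket_eur = 64.5
    purchases_per_customer = 3.9

    purchases = visits_per_year * conversion_rate      # ~69 million purchases
    revenue = purchases * avg_basket_eur               # ~4.45 billion euros
    customers = purchases / purchases_per_customer     # ~17.7 million customers

    print(f"{purchases:,.0f} purchases, ~{revenue / 1e9:.2f} billion euros, "
          f"{customers / 1e6:.1f} million customers")

Four simple metrics already sketch the size of the customer base and the revenue it generates.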

This explosion of generated and stored data has not simply happened by chance. In recent years, storage devices have become increasingly larger, cheaper and, thanks to the cloud, more accessible and easier to use. The challenge for companies now is to put all this data to use in an intelligent way; to transform data from a passively stored by-product into an active policy instrument. Some people call data the new gold. Nathan Bijnens and Wesley Backelant agree. As Cloud Solution Architects at Microsoft, they are both experts in the field of smart data processing. ‘These days, every company has a huge amount of data sources’, says Bijnens, who helps the EU and the public sector, among others, to do more with their data. ‘So, it makes absolute sense and is a good idea to look for ways to do something meaningful with this data. The data comes from everywhere; from classic sources such as ERP and CRM systems, databases and Excel files, but also from newer applications such as Internet of Things (IoT) sensors, control systems within production units, click and weblog data etc.’

Dream team: big data and AI

Each company has its own data sources, but simply collecting and keeping track of all that data does not provide any direct added value. This only happens when you process the data in a smart way: individually per data source, or combined. There are no ready-made solutions for this; something like this must be customized to suit the needs of each company. What is the same for all companies is that data is growing faster worldwide than people can be trained and called upon to analyse it. In addition, there are many data sources that are only useful when they are processed in real time: data from social media, for example. Therefore, algorithms are indispensable. They are an invaluable part of artificial intelligence (AI). An algorithm is a series of instructions that lead you from a starting point to an end point. Without those instructions, machine learning (techniques with which computers can learn, analyse and even make predictions themselves), for example, would not exist.
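The difference between hand-written instructions and machine learning can be shown in a few lines. In the toy sketch below, the first function is an explicit algorithm, while the model learns its own rules from examples; the data, labels and model choice are purely illustrative assumptions.

    # Toy contrast between a hand-written algorithm and a learned model.
    # The data, labels and thresholds are illustrative assumptions.
    from sklearn.tree import DecisionTreeClassifier

    # A hand-written algorithm: explicit instructions from start to end point.
    def is_urgent_by_rule(length, has_keyword):
        return has_keyword and length < 200

    # Machine learning: the computer derives its own rules from examples.
    examples = [[50, 1], [300, 1], [40, 0], [500, 0]]  # [length, keyword?]
    labels = [1, 0, 0, 0]                              # 1 = urgent

    model = DecisionTreeClassifier().fit(examples, labels)
    print(model.predict([[60, 1]]))  # classify a new, unseen message

In both cases an algorithm leads from input to outcome; the difference is whether a human or the training data writes the rules.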

‘Lately we have seen a great deal of innovation in the field of smart algorithms’, says Backelant, who helps industrial and automotive customers with their data processing. ‘These algorithms are becoming increasingly accessible and are also increasingly able to combine and/or span different data sources and computer applications. Also important is the fact that AI has recently made it possible to process certain data sources that until a few years ago were unusable for distilling useful data. This includes photos, video images and speech, which can be analyzed automatically. Today, for example, English speech can even be recorded and understood by machines with a margin of error that is as small as if two people were talking to each other. This so-called human parity, in which computers understand spoken languages just like people, will only increase in the coming years. As I said, this is already happening flawlessly in English. But in the future, this software will not only be able to correctly register and understand more languages, but also accents and pronunciations.’

Know your customer / Know your company

The smart handling and analysis of different data sources provides added value for every company, regardless of the sector in which it operates. First and foremost, companies that really focus on their data get a much clearer and better insight into their customers, according to Bijnens. ‘Companies that better understand their customers can also respond more efficiently to their needs – and in a more personal way. Those who achieve that insight into their customers thanks to a targeted data analysis have an extremely powerful policy instrument at their fingertips. Data insight not only provides knowledge, but also a competitive advantage. It is no coincidence that most companies that challenge or even disrupt the market today are data-driven. Just look at Netflix and Facebook. Companies that understand their customers through data can make the right decisions, faster.’

But there’s more. Smart data analyses can also be used to improve and optimize business and production processes. Take AB InBev’s SmartBarley platform, for example, Wesley Backelant explains. ‘Barley is an essential ingredient for beer production. The brewery’s digital solution brings together the knowledge of barley growers to create value for their harvests and the supply chain. By having access to crop data collected from all over the world, farmers can learn from each other’s experiences, exchange good practices and learn about new techniques. This way, they acquire new insights and learn from each other. This ultimately results in barley of a higher and more consistent quality. This reduces costs: the harvests are better. But it is also better for the environment because fewer pesticides need to be used. And of course, better barley is also an added value when brewing beer.’

It’s clear that the analysis of big data is so much more than just a technological tale. Analyzing data purely for the sake of it is of very little use. The key is to set goals. As a company, you do this by looking for ways in which data can improve the organization, says Ken Geeraerts. He is BI Team Lead at Kohera, which assists companies in the smart conversion of data into insights. ‘You first have to make sure that the data is stored in the best possible way and then made available centrally. Then you look for ways to convert that data into a policy instrument that helps you make targeted and strategic decisions.’

In short: stored data first needs to be activated. This then results in certain interpretations and insights, and from these insights data-driven decisions can then be made. Every company benefits in one way or another from this transformation process. For example, Kohera created a cloud solution for the Kinepolis Group. The cinema giant uses this solution to find out why customers go to the cinema. From there, the service can be even better tailored to the customer, and new products can be launched. In other words: the solution ensures that customers are better understood and served.

Proof of concept: technology is not the end goal

‘When implementing methods to transform data into new information, it’s important to start small’, says Nathan Bijnens. ‘The most important thing for a company is that you get started! Of course, you don’t have to start with a huge project right away: start small. Work on a proof of concept that also provides added value in the short term. Don’t think too theoretically about goals you want to achieve in five years’ time, for which you first need two years of preparation. If you start today, you will start learning lessons from the whole process and build up experience. The feedback that results from such an initial experiment is also invaluable. If you analyse data to be able to make better decisions, or to predict things, it is important that you can measure the impact of that process.’

In this respect, the most successful data companies are now setting a good example: it is no coincidence that each and every one of them has built well-thought-out feedback loops into their data processing. ‘A problem that often recurs in practice at a number of companies is that they are reluctant, or even unwilling, to set aside a budget for data experiments’, Wesley Backelant observes. ‘For this purpose, as well, small start-up projects are an ideal solution: they are the first step on the road to developing the right data mindset for a business. And with their direct results they can also pave the way for larger projects at a later stage: at departmental or company level. It is equally important to accept that such an experiment can also fail or not produce the desired results. But that, too, is an excellent learning process. And when done on a small scale, it’s not a disaster either: everything can be adjusted fairly quickly and easily. That used to be different. Until a few years ago, AI projects were inherently gigantic. For example, I know a company that spent years collecting data using all kinds of sensors. A large AI project recently failed, resulting in a lot of wasted time and money. If that company had set up a small experiment at an early stage, they would probably not have had that problem: certainly not on that scale.’

It always makes sense to start small, in both the short and long term. Even within a period of just two weeks, you can achieve a lot with such an experiment, says Nathan Bijnens from experience. ‘You quickly get a pretty good picture of whether you’re on the right track, and whether the approach can be extended to achieve the desired results. This means that just about any medium-sized or large company can start such a big data and AI process with virtually no risk. I think it’s actually riskier not to do such experiments, or to postpone them. How should you get started? Start by identifying the type of data that is generated within the company, and then determine what you want to get out of it. Then set up a pilot project to see how the desired results can be achieved, or how they definitely cannot be achieved. This approach really works. We see that every day with our customers.’

Someone who shares the same opinion is Danny Otten, sales manager at Nimbuz. ‘Building a pyramid that lasts for centuries is a very slow process and it costs a lot of time and money. Go for good enough: a good tent can be put up in no time and lasts long enough. That way you also keep the budgets under control.’

Kohera often kicks off such a process with a workshop, Ken Geeraerts explains. ‘We spend two days identifying the business needs and challenges and how they can be solved. We then select the most urgent topics so that we can come up with a proof of concept. We can then build on that later.’ Danny Otten of Nimbuz also often uses a similar approach. ‘We listen to our clients, and then inspire them. We do this by showing them examples of data solutions that we have built or designed for other clients. From there, we create a project that might also benefit them, and we formulate an initial proof of concept: within a budget-friendly, realistic timeframe.’

The chances of success of an initial data experiment depend, of course, on the complexity of the issue in question. For example, a test with a chatbot system, for which several concrete solutions already exist, will rarely end in failure these days. Today, there are numerous AI models with a proven track record of strength and efficiency, developed both by commercial international companies and as open-source solutions available free of charge. These models can be adopted in a very accessible way and embedded in the data flows of companies, says Wesley Backelant. ‘Suppose a company wants to analyse social media channels. In doing so, it wants to go further than simply counting the number of likes and reactions. It also wants to map the content of those reactions or images automatically, while of course respecting privacy regulations. Today there are tested models that can be used straight away. And those models are also becoming increasingly accessible. Moreover, they can be personalised and scaled, tailored to the processes for which companies want to use them. This also saves time and costs.’

The crystal ball: AI as a predictor

Machine learning can also be used to predict certain trends, evolutions, needs or actions. That, too, will only increase in the coming years. Nathan Bijnens: ‘Predictive maintenance is a good example of this. These are maintenance techniques that, usually via IoT sensors, help determine the condition of equipment. This makes it possible to estimate the best moment to carry out maintenance. This saves costs compared to routine or time-based preventive maintenance, because tasks are only carried out when they are justified. Or advice can be issued automatically, in the field of production quality control, for example. There are also possibilities for marketing companies to predict the number of clicks on a link. Energy companies can anticipate electricity consumption patterns. Fintech companies use these models to proactively trade on expected stock prices. Today, for example, many investment transactions are already fully automated: to save both time and costs. Farmers can use satellite images and weather forecasts to optimise their harvests. NGOs can predict the time and location of outbreaks of disease and epidemics – and thus use these models to save lives, for a better world. The same applies within healthcare, where, for example, certain cancers or other conditions can be predicted via smart data processing. Or AI that uses radiological images to detect conditions at an early stage, which are not yet visible to the human eye. In short: the possibilities for using AI predictions are truly endless.’

Danny Otten of Nimbuz mentions even more examples where data predictions can help companies move forward. ‘We currently have a project in the preparatory phase for a drinks producer. They want to see a correlation between changes in the weather and which drinks are sold the most in which weather type. They can use that information to adapt their production: in warm weather, consumers drink different things than when it is cold, for example. Based on weather forecasts, the producer ultimately wants to fine-tune its purchasing of raw materials and production processes. This will help them optimise their production capacity. Even if they achieve only a 2% improvement here, this will still be an immense return. Especially when the project is rolled out internationally later on. Simply put: the possibilities for using analyses and predictions based on data are almost limitless.’
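A first iteration of such a weather-to-sales model can be surprisingly small. The sketch below fits a simple linear regression of units sold against temperature and applies it to a forecast; the numbers and the model choice are illustrative assumptions, not the actual project described above.

    # Illustrative sketch: predicting drink sales from temperature forecasts.
    # The data and model choice are assumptions, not the project quoted above.
    from sklearn.linear_model import LinearRegression

    # Historical observations: daily average temperature (°C) vs. units sold.
    temperatures = [[4], [9], [14], [19], [24], [29]]
    units_sold = [310, 340, 420, 520, 690, 880]

    model = LinearRegression().fit(temperatures, units_sold)

    # Use a weather forecast to plan next week's production.
    forecast = [[17], [22], [27]]
    for temp, prediction in zip(forecast, model.predict(forecast)):
        print(f"Forecast {temp[0]} °C -> plan for ~{prediction:.0f} units")

A real project would add more features (day of week, promotions, region) and validate against held-out data, but the feedback loop starts with something this simple.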

Without the cloud, there’s no AI

The cloud is a big enabler of AI. A company that wants to start a short-term experiment does not have the time to first develop an entire big data infrastructure for it. Thanks to the cloud, you can get started right away, says Wesley Backelant. ‘This means existing solutions – both commercial and open source versions – can be used straight away to tackle a specific data challenge. The cloud makes AI more accessible and cheaper than ever. In fact: without the cloud, AI would probably not exist today.’

“The cloud makes AI more accessible and cheaper than ever.”

Ken Geeraerts of Kohera also believes that the cloud plays an indispensable role. ‘It makes the process more affordable. Developing and implementing a data solution within a company’s physical IT structure often costs many times more than making it available in the cloud. Scalability in the cloud is also easier, more flexible and cheaper. It is also easier for a company to “quickly” place a gigantic volume of data in the cloud, run the AI model on it, train it properly and draw the right conclusions from it. You can build such a test project in the cloud in just two days and then throw it away completely if you need to, with minimal costs, to try something new and better. That’s impossible to do on-premises. Locally installing an AI solution on an organisation’s computers is simply the old-fashioned way of doing things.’

Digital transformation

A fundamental mistake companies sometimes make is to view and treat big data and AI as an IT project. After all, the role of the IT department within a company has changed dramatically in recent years, and will continue to do so. At best, IT is a partner in this story. A company’s IT department should never be completely excluded when it comes to big data and AI: after all, they have the best knowledge about the company networks, the data stored in them and how everything is secured. ‘The fact is that companies have a huge responsibility when they are working with data’, says Wesley Backelant. ‘Security is extremely important in that regard. A data loss or incident not only causes technical difficulties, but often also reputational damage. A big data project only works when it can also be executed securely.’

“Companies that are not actively working with data in a few years’ time will sooner or later become irrelevant, there’s no doubt about that.”

Some companies have data, but don’t know what to do with it. Other companies don’t even realise that they’re sitting on a mountain of useful data and information, explains Danny Otten. ‘When they analyse their clients’ data in a targeted way, they discover what information can be extracted from that data. For example, the number of hours worked on a fixed-price project. And we often see this in production environments as well, where a lot of data is generated by machines and sensors. These data are often isolated from each other. By pooling and combining them, you generate a lot of new insights. You can then use these, for example, for more optimal planning of production processes, the purchase of raw materials and deliveries, and the deployment of personnel. Usable data is everywhere.’

In any case, your big data and AI will be more effectively managed in a business transformation process. You can set up a data analysis department or an innovation lab for that purpose. Or implement it even more extensively, for example, by introducing a data scientist to each business unit. Nathan Bijnens: ‘This makes organisations very agile when it comes to responding to changes: they can anticipate them or even innovate in certain areas. Data not only strengthens and accelerates companies, but also makes them more resilient. Companies that are not actively working with data in a few years’ time will sooner or later become irrelevant, there’s no doubt about that. Data analyses are becoming less and less an option, and more of a necessity. Those who don’t do it will be beaten by players who have done their homework, and therefore have gained an advantage.’

Start today, be ready for tomorrow

The future? General AI: the findings and insights gained through big data can also be applied to other domains. A pre-trained model that can be enriched with your own company data. A company’s advantage will be less and less based on the technology used, and more on the data they process with it, the feedback loop they get from it, and how they strategically use the results to make decisions. Ken Geeraerts of Kohera believes that big data in combination with blockchain and the Internet of Things (IoT) will be able to measure or predict a kind of butterfly effect in a few years’ time. ‘For example, it will be possible to predict that when a customer buys a bottle of milk in shop X, there is a good chance that they will book a trip to Senegal next year. More and more connections will be made that seem impossible today. The options available to companies to improve their activities, strategies and results are truly endless.’

In the coming years, extremely powerful quantum computers will further strengthen and accelerate AI’s role within big data. This will enable more and more real-time analyses and predictions to be made, says Ken Geeraerts. ‘Thanks to quantum computers, data streaming will be capable of streaming analytics. Those computers already exist. And they will probably also become more accessible and affordable in the future.’ Danny Otten agrees: ‘These powerful quantum computers pave the way for real-time decision-making. In the past, computers used to make calculations; today and certainly tomorrow, they will help to make decisions.’ AI will certainly become much more accessible, believes Nathan Bijnens: ‘That trend is already picking up steam. Look at chatbots: they already represent a smooth collaboration between humans and algorithms. I can also envisage AI being used more in smarter assistance and RPA: Robotic Process Automation, in other words the automation of repetitive actions via software. But being able to look up and retrieve information is also crucial in a data processing strategy. Many of the organisations I work with could no longer see the wood for the trees because of the large number of archives they have created. These organisations can be helped with a much smarter search on their own documents. Gaining access to the right information at the right time accelerates business processes. In this respect, I can see many exciting innovations in terms of text analyses, speech, search terms and chatbots in the coming years.’

“Gaining access to the right information at the right time accelerates business processes.”

When is a digital transformation considered a success?

When the user sees that there is a Return On Investment (ROI), concludes Ken Geeraerts. ‘They will notice this automatically, when they can act and make decisions faster, better and in a more targeted way. And they won’t need to rely as much on their gut feeling, but more on objective and optimally substantiated analytical data, to which they have access via a clear, comprehensible dashboard.’ A digital transformation should not be a project; it should form part of the company’s overall strategy. Danny Otten: ‘Companies should develop a compatible vision for this. You can’t draw up a plan for the next three years, because this is a dynamic process. But as a company, what you can do is make sure that everything you do originates from out-of-the-box thinking. That is how you develop a general vision. Otherwise they will remain separate projects and will never form a coherent whole. As a company, you will have frittered away a lot of potential.’

In addition, actually initiating a digital transformation – getting started in the first place – is more important than the end goal, thinks Nathan Bijnens. ‘It is vital that you are ready to make the transformation. But a digital transformation is, by its very nature, never complete. As a company you have to keep continuously evolving, just like the markets in which you operate. The end goal of a digital transformation is to create an organisation that is constantly changing. One that is ready for change, and can respond to that change in the best way possible. It can’t do that with technology alone. It also needs to have the right mindset, so that it understands that next year’s business world may be completely different from this year’s. It should be able to anticipate that development by analysing and even predicting data. Companies can only ever achieve all those things by making a start. Preferably sooner rather than later.’

10 Opportunities for a modern workplace

Are you stretching personal productivity to its limits? Your people are now mobile, networked, always-on. Their work-life balance has shifted, for better or for worse. In return, people start consciously protecting their valuable private time. They’d be happy to work smarter, not just harder. Technology is just an enabler: if you use it the old way, it’s not likely to provide new value. It requires you to change your workplace habits. Below are ten typical situations where modern workplace technology could make for a more efficient, more engaged and more productive workforce. At the same time there’s an opportunity for your organization to evolve from perimeter-based defense to granular, cloud-enabled, identity-centric security.

 

1 – People take their laptops everywhere 

Work is wherever the job takes us. At the airport, at a customer’s site, at home. We need information and applications at our fingertips. So, we make local copies of important documents, or try to link back to the company network. Imagine the trouble when a laptop gets stolen or damaged. If only we had secure cloud services allowing us to work from any device.

2 – Security is perceived as a showstopper

Do your people like to try new things? Like putting up a live demo on the fly. Maybe they would. Yet do they have the proper device, all the applications and a link back to your network? If the setup is a headache, they won’t do it in front of a customer. If only you had a secure cloud solution.

3 – GDPR compliance is so important 

GDPR was a tough exercise. Now you have carefully documented when, why and which collaborator should have access to personal information. GDPR compliance, however, is not a one-time effort. With proper tools, automated assessments and digital identity solutions it’s easy to stay compliant.

4 – We’re e-mailing draft documents using tracked changes 

Old habits die hard. ‘Tracked changes’ are great for reviewing text-based documents, if the number of versions and contributors is limited. Co-authoring, however, requires a different mindset and toolset. Do your people draft, edit and comment online in real time?

5 – E-mail folders for keeping an archive 

Sorting e-mails by sender, subject or follow-up status was a good practice. Long ago. Keeping incoming messages as an instant to-do list was easy. So was retrieving important old messages within seconds. Now people can’t keep up with adding subfolders, reading and deleting. They can’t remember the last time their inbox was empty. They need 21st century messaging and collaboration.

6 – People like to multitask, yet easily get distracted

Experts say we don’t actually multitask. Humans switch back and forth between tasks, often fatiguing their minds and killing productivity. Doing e-mails over a group call makes it hard to focus. On each. Yet it’s  perfectly possible to work with many on the same subject using voice, chat, video and online collaboration. 

7 – Binders with important documents to be signed 

Are your executives still signing contracts, orders and operational documents on paper? Do they take time to read those properly? Digital workflows let you sign documents on screen, speeding up the approval process, at your pace, with full traceability. 

8 – Recurrent meetings for following up on action items 

Nothing’s wrong with recurrent meetings, as long as they make sense. Online or face-to-face meetings should be about team participation, motivation, strategy and collective decision making. There is no need to physically get together to discuss, work on whiteboards or dispatch tasks efficiently. And online tools take care of listing action items and continuous status reporting.

9 – People take meeting notes, yet never use them  

Taking notes is important. At the next recurrent meeting the notes are approved or amended. It’s a standard procedure. Sensitive information is typically left off-record. Action items are mostly done  

10 – If their contact person is out, customers need to call in later 

Personal contact is key. If clients call in, they might hope to hear a familiar voice. Yet mostly they look for instant information. Can any person take the call? Do business applications provide for a single source of real-time truth? Or do people stick to Post-it notes and ask customers to call back later?

The Smart Workplace in 2030

How to empower the next wave of productivity and innovation?

Organizations and individuals struggle to understand the impact of today’s changes on our future professional life. Technology is becoming more pervasive. By 2030 most of our workforce will be digital natives. They literally can’t remember what life was like before the internet and the cell phone. The race for their talent, productivity and innovation will be about embracing change, investing in tools and offering them compatible management practices. 

Technology can be overwhelming. And it’s just getting more so. The younger generation might be at ease chatting with ten people simultaneously. Still, at some point we all get saturated by too much information. Too many loose ends. Too many open tabs.

Do we all need to become even more ICT literate? Do we all need hacking skills? If we’re going to thrive using the next wave of productivity-enhancing tools, we’d first need to change our mindset. We need to let go. And let technology handle technology.

Workplace innovation shouldn’t be about adding new ways of working on top of the old ones. When new tools and methods add complexity, people won’t let go of their old ways. It’s time to stop adding bells and whistles and to start thinking about simplicity. Less is more, especially in ICT.

 

Information is everywhere: abundant interfaces on the go

The ‘pc’ once revolutionized computing. Many things have changed since. Yet most of our ICT-driven work is still done in the same static setting involving a table, chair, screen, keyboard and mouse. It’s efficient for individual productivity, slightly less so for collaboration and co-creation. When cell phones and the internet came along, people learned to work anywhere, at any time. Now we change our place of work regularly, yet not so much our way of working.

The smartphone has partially replaced the pc, especially in the private space. Now there’s an app for everything. Yet at work we stick to our old computer. It’s lighter now, with longer battery life and… with the familiar screen, keyboard and mouse interface.

Many new form factors have come around. Tablets make reading comfortable. Smartwatches give you updates on the go. Companies like Microsoft offer innovative interfaces through gesture or speech with Surface, Xbox Kinect, Cortana and HoloLens. The slow adoption of some of these is testimony to the force of collective habit, and to some of today’s technological limitations.

Virtual Reality can offer a very intense immersive experience, even adding artificial scent. New business applications will more likely come from Augmented Reality and Mixed Reality, which allow information layers to be added in daily life – through connected cars, smart glasses, ambient intelligence or wearables. For different types of activities, we could use well-adapted devices instead of using our computer or smartphone as a digital Swiss Army knife.

Cloud and identity services will soon take the complexity out of this abundance of interfaces. Only when identities, applications and sessions can be transferred seamlessly from one device to another can you really choose which device works best at any given moment. Things become simple if we can dissociate digital identities, devices, applications and data. By 2030 switching devices will be easy. We’ll probably have dozens of devices – per worker, per desk, per type of activity. Maybe people will still carry around a personal device at all times. Yet it would likely not be as bulky as a laptop.

 

A view from the Clouds: new ways of sharing and retrieving information 

Most people save and retrieve information based on the device, drive or folder where they decided to store it – hopefully with a meaningful filename, description or nametag. It’s today’s digital analogue of an old paper archive. We’re putting so many files in so many folders that it becomes hard to remember what went where.

Today people increasingly use search technology to retrieve the files they’re looking for. Sometimes it’s hard to retrieve the last version among many almost identical ones. It is even harder to retrieve information within your organization that you did not author. Here’s the paradox: the more information your organization produces, the more difficult it becomes to rapidly find what you’re looking for. And there’s no value in information you can’t find.

Another issue is the confidentiality and lifecycle of information you’d like to share. E-mailing documents to people inside or outside your organization means losing control. Duplicate versions will co-exist and get passed around. Secure file-share locations are slightly better, yet the access and security parameters are still pre-defined. The ‘safe zone’ or ‘perimeter’ based security model makes it hard to instantly share information in a secure yet flexible way.

With Cloud technology, it somehow stops being relevant ‘where’ you put your files. It becomes more important to think about discoverability and access rights. Metadata could help you find the most relevant document, the most recent version, the finalized and approved one. Yet workers and teams might decide to limit discoverability by default to themselves only. This could reinforce information silos within the organization, and finding information would still be painfully hard.

Organizations should classify information based on roles, identities and activities. This moves away from perimeter-based security. Security should be multi-layered and flexible if it is not to become a roadblock to agile business initiatives. Conversations are an intuitive way to do this. In a modern workplace you deal with access rights by simply pulling someone into a conversation. To talk or to share is an instant disclosure decision. Information security should be able to follow and enforce that decision – and have people thinking more about confidentiality, and less about technology.
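As a minimal sketch of that idea: in the toy model below, access rights follow the conversation itself rather than a pre-defined share location. All names and structures are illustrative assumptions, not a description of any specific product.

    # Toy model: access follows the conversation. Pulling someone into a
    # conversation IS the disclosure decision; security simply enforces it.
    class Conversation:
        def __init__(self, topic):
            self.topic = topic
            self.participants = set()
            self.documents = []

        def add_participant(self, identity):
            # Inviting someone is the access-granting decision.
            self.participants.add(identity)

        def share(self, document):
            self.documents.append(document)

        def can_read(self, identity, document):
            return identity in self.participants and document in self.documents

    budget = Conversation("2030 budget")
    budget.add_participant("alice@example.com")
    budget.share("forecast.xlsx")

    print(budget.can_read("alice@example.com", "forecast.xlsx"))  # True
    print(budget.can_read("bob@example.com", "forecast.xlsx"))    # False

A real system would add classification labels and auditing, but the principle is the same: the disclosure decision and the security boundary coincide.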

 

Your brain on steroids: making sense of information through Artificial Intelligence

More information makes things harder to understand. When looking for a needle in a haystack, it doesn’t help to add more hay. In 2030 there will be loads of information around about your activities, your projects, your interactions. Commercial companies will be logging and correlating their customers’ profiles and purchase history. Everything they do. Collaborators will come and go. Teams will be highly dynamic. So how can you still make sense of the heap of information you’re sitting on?

Artificial Intelligence (AI) will help your understanding, in two different ways. Firstly, AI helps computers to keep up with human interaction. It’s sometimes hard to get our expectations right. When someone hands you a business card, you’d spontaneously be able to identify the street address, e-mail and phone number. Basic AI means a computer can now do the same.

Is this trivial? Not from a technical point of view. Is it useless? Not when you have dozens of business cards to process. Basic AI helps you interact with computers in a more natural way. It is already present today in our smartphone’s digital assistant, in online recommendation engines, in Office 365 Delve. Natural Language Processing (NLP) helps a computer to understand spoken language. It uses our human logic, instead of adapting our thinking to a computer’s very structured input & output.
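As a toy illustration of that kind of extraction, the sketch below pulls an e-mail address and phone number out of business-card text with hand-written patterns. Real systems use trained NLP and vision models instead of rules like these; the card content and patterns are illustrative assumptions.

    import re

    # Toy entity extraction from business-card text using hand-written rules.
    # Production systems would use trained NLP/vision models instead.
    card_text = """
    Jane Doe
    Cloud Architect, Example NV
    jane.doe@example.com
    +32 475 12 34 56
    """

    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.]+", card_text)
    phone = re.search(r"\+?\d[\d /.-]{7,}", card_text)

    print("E-mail:", email.group() if email else "not found")
    print("Phone :", phone.group().strip() if phone else "not found")

The gap between this rule-based toy and a model that generalizes across layouts, languages and fonts is exactly where the AI comes in.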

By 2030 searching for information will be a more natural thing – based on text input, natural language and contextual info. Digital assistants, chatbots and spoken language will become very natural forms of user interaction. Contact centers could look completely different: humans could provide follow-up for special situations while computers efficiently provide automated answers to frequently asked questions.

AI helps improve data quality by easily detecting nearly identical information. It helps put a wealth of personalized information at your fingertips – whether you are a professional, customer or end-user. In some sectors natural digital interfaces will make the difference in customer satisfaction.

Artificial Intelligence would still need us to adapt to technology, yet in a different way. It would take a  mental effort to let go. To let a car park itself, instead of firmly holding control of the steering wheel.  Author Fredo Desmet pleads for us to consciously give away some control to technology, and to embrace our ‘artificial stupidity’*. 

Secondly, strong Artificial Intelligence is a very different thing. Instead of focusing on user interaction, it helps strategic decision making. Strong AI will help you define trend baselines from ‘big data’ collections and rapidly detect changing conditions. It helps smart enterprises to react quickly, through data-driven decision making. 

Strong AI is typically developed at boardroom level first. Yet by 2030 organizations might use it at any level – to automate routine decision making, and to offer better insight for workers in the field. In the future AI tools will become increasingly visual, enabling rapid yet well-informed business decisions.

It takes some maturity as an organization to empower workers with AI-powered insights. If done right, doing so will make your organization’s strategy very transparent and actionable, and AI will help get everyone pointing in the same direction. If done badly, AI could foster endless discussions about which strategy to take. Empowering workers with strategic insight should be done within an organizational culture of responsibility and trust.

Before even considering strong AI for decision support, organizations should develop a big data strategy. ‘Garbage in, garbage out’ most certainly applies to business analytics. If you don’t have consistent, high-quality data sets, don’t expect consistent, high-quality answers. You might need time to develop and train predictive algorithms. And everything starts with reliable, manageable source data.

When personal data are used for AI-driven analytics and instant decision making, your customers will notice. They need to understand and agree. Their privacy is precious. If an organization is not transparent about it, sooner or later it’ll lead to fierce reactions – loss of customer confidence, reputation damage or worse. In the European Union the General Data Protection Regulation (GDPR) provides guidelines, and serious sanctions, for proper privacy protection. 

By 2030, people will likely be even more privacy-aware, probably after some privacy breaches or scandals have touched them personally. They’d accept giving detailed information about themselves when it allows them to get personalized services that better fit their needs. Yet they’d likely no longer accept handing over personal information with no questions asked. It’s important to provide transparent, easy-to-understand information and granular privacy choices. 

Lastly, the use of strong AI could lead to serious ethical questions. These might touch the fundamentals of your organization’s mission. Not everything that’s technically possible is also desirable, acceptable and sustainable. Whether you are a for-profit or a non-profit organization, clear guidelines should be put in place. 

 

The power of digital conversations: empowering fluid collaboration 

Imagine your workforce behaving as one body. Imagine expertise flowing fluently through all parts of your organization. You could call in someone with specific skills on the spot, have one or two questions answered, and both move on to the next issue. Actually, you can already do so today using group chat, video chat and advanced collaboration. 

Think about the possibilities of having freelance collaborators and external experts joining in seamlessly, instantly. It could extend the notions of your ‘workforce’ and ‘organization’ to a highly dynamic ecosystem. Formal recruitment and HR would still exist. You would offer a broad range of possibilities, from full-time employment to temporary project lead or expert-on-call. Technology and new business practices would allow for hybrid careers. Workers would be able to quickly adjust their work-life balance – for example, to assume their role as a parent or take care of others. Obviously, the legal and fiscal frameworks might differ from one country to the other. Typically, legislation has a hard time keeping up with technology-driven change. Yet due to the scarcity of expertise and the expectations of the millennial generation, this will be the battleground of the next ‘war for talent’. 

Some expect large enterprises to invest in outstanding campuses, as a hotbed of innovation and organizational culture*. They would be an inspiring, attractive environment for workers, temporary workers, suppliers and business partners. Working there would be considered a privilege, with work and private life blending together. Work would be rewarding in many more ways than just financially. 

Imagine your collaboration environment to be a knowledge network. You’d no longer be waiting for one specific person with the right expertise to become available. Someone in your organization, or in your business ecosystem, might be a perfect fit. You’d simply need to know who. 

The next generation of knowledge management will not just instantly retrieve information, but also skills and expertise. You simply specify what kind of expertise you’re looking for, and your collaboration system should find the perfect match. Artificial Intelligence will search personal profiles, but also documents, chats, conversations and online meetings. Organizations will be pleasantly surprised by the untapped skills and knowledge they already have within their workforce. 
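
A toy sketch of such expertise matching: ranking colleagues against a query by the overlap between the query terms and the text traces attached to them. The names and profiles are invented, and a real system would use semantic, AI-driven search across profiles, documents and chats rather than raw keyword overlap.

```python
# Toy expertise matcher: rank people by keyword overlap with a query.
# Names and text traces are invented; real systems would use semantic
# (embedding-based) search across profiles, documents and conversations.
people = {
    "An":    "sharepoint migration governance teams adoption",
    "Bram":  "azure kubernetes devops pipelines monitoring",
    "Chloe": "power apps power automate process automation",
}

def find_experts(query: str, top_n: int = 2):
    terms = set(query.lower().split())
    scored = [
        (len(terms & set(text.split())) / len(terms), name)
        for name, text in people.items()
    ]
    return sorted(scored, reverse=True)[:top_n]

print(find_experts("process automation with power automate"))
# -> [(0.8, 'Chloe'), (0.0, 'Bram')]
```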

Imagine a CEO delivering a speech or taking on tough negotiations. Through Mixed Presence, she or he could have a team of analysts, managers and experts participating off-site. The team could do research and fact-checking on the fly. They could flag false statements, suggest detailed questions and provide updated information through Augmented Reality interfaces. They could coach the speaker and closely monitor non-verbal reactions from other participants. 

Information would literally be at one’s fingertips. Human Intelligence would blend with Artificial Intelligence into a versatile thinking combo. Decision making would be greatly accelerated, based on facts and expertise, rather than gut feeling.  

Imagine imagination. If personal enthusiasm and out-of-the-box thinking could meet with the right tools and business knowledge, innovative new ideas could be developed faster, and better. Modern workplace tools would greatly benefit early go-to-market initiatives – testing crazy ideas in real life. It would help large organizations foster a true entrepreneurial spirit, now typical of smaller-scale start-ups. It would help eliminate procedures, functional silos and all the complexity that usually comes with scale. 

The modern workplace will be an environment where information technology blends in perfectly, so we’d stop thinking about anything technical. ICT tools should become as intuitive as riding a bicycle. After a short learning curve, we should totally shift our focus back to work – chatting, meeting people, drafting proposals, presenting, convincing and delivering. 

Some futurists argue information technology is nearing the end of its dominance. Could something else replace ICT as the next major driver of economic development? Just like the steam engine, steel, electricity or motorized transport? If by 2030 information will be as abundantly available as electricity or water, we might think of it as a commodity. Yet powerful, simple and lightweight information management will define how well you’ll be prepared for the next big thing. 

Microsoft Teams for Socialistische Mutualiteiten

Microsoft Teams for Socialistische Mutualiteiten

How the worst of times can bring out the best in people …

Before COVID-19 arrived, all employees of Socialistische Mutualiteiten typically worked in the office. Due to the sudden lockdown, everyone had to start working from home. A Skype for Business platform was in place, but it did not allow external communication. SocMut faced a big challenge: how to rapidly deploy a solution that allowed all employees to connect to the outside world, including audio and video capabilities – and how to make this solution available on SocMut laptops as well as on employees’ personal devices. 

SocMut engaged our colleagues from The Flow Consulting and asked for their help to get Microsoft Teams deployed. To speed up the deployment, the decision was made not to integrate with the existing Skype for Business setup. 

Within one week, they were able to provide Teams to 5,000 users. On an average day, 1,600 users collaborate and communicate via Teams, 600 one-to-one calls are completed, and 22,000 chat messages are exchanged. 

To get users familiar with this new technology, manuals were created on how to install MS Teams on any device. All documentation and manuals were centralized on a website. Intense communication had a positive impact on user adoption. This approach resulted in virtually no impact on the Service Desk, i.e. only a limited number of additional calls per day. 

The initial need was a platform to communicate via text, voice and video. In the meantime, SocMut is looking into new scenarios such as MS Teams Live Meetings and virtual consultations. 

Do you have questions about how to bring your workplace environment to the next level ? 

Contact us at info@nimbuz.be 

Griffith Foods wins 10% of production capacity with AI-driven scheduling tool

Griffith Foods wins 10% of production capacity with AI-driven scheduling tool

Griffith Foods is an American product developer of customized food ingredients. We got in contact with the team at the production facility in Herentals, which asked us to develop an intelligent production-sequencing tool to maximize productivity on the factory lines.

Our team developed a tool that optimizes the cleaning schedule of their production lines to regain production capacity. Thanks to the tool, the number of so-called wet cleanings (a 30–90 minute standstill of the line) decreased by 5%. This way, 17 production days are won back annually, in just one production facility. In addition, the site becomes greener by saving 1,000 m³ of water per year and using fewer detergents. All of this has contributed to a payback period of just 10 months.

 

An answer to our continuous growth

Griffith Foods produces all kinds of customized food products for their customers. Rising demand led to a search for innovative solutions and tools to boost production capacity. The cleaning of the production lines quickly came into view, says Oscar Sluiter, Senior Director Global Supply Chain at the company: “To guarantee the quality of each product and to ensure, for example, that no unwanted transfer of allergens, color, taste and other product properties takes place, the lines have to be cleaned between production batches. Absolutely necessary, but on average we lose about 165 production days per year, across all lines. The idea came up to see if we could reduce the number of wet cleanings, because they are not needed between every batch. Sometimes a faster, dry cleaning is sufficient.”

 

The Challenge

The objective of this project was to minimize the cleaning time of the manufacturing equipment between the production of different additives. If cleaning takes less time, there is more time for production, and this way the company can increase its revenue. Until now, planning the products on a production line involved quite some manual work, because the planning is different every day. So a second objective was to help the planners do their job in a more efficient way.

 

 

High-level problem statement

Not every product is the same. They differ in color, odor, texture and allergens. It’s important to take these differences into account during production, because Griffith Foods wants to avoid products getting contaminated by other products in terms of color, allergens, … Since the beginning of production, Griffith Foods has ensured quality through a thorough wet cleaning between each product. This takes between 30 and 90 minutes, each time. And it’s not always necessary: if products are compatible, a dry cleaning is enough. Therefore, Griffith Foods asked Arinti to write an algorithm and create a tool that minimizes the number of wet cleanings based on the differences and similarities between the products. This can save a significant amount of time and effort per day!

“THE COLLABORATION WITH ARINTI IS GOING WELL AND THE TOOL IS WORKING FINE. THAT TRANSLATES TO A ROLL-OUT ACROSS ALL OUR SITES. CURRENTLY, TWO OTHER FACTORIES IN EUROPE ARE ALREADY PLANNED FOR THE NEXT TEST. ULTIMATELY, ALL OUR 21 LOCATIONS ARE PLANNED.” – OSCAR SLUITER, SENIOR DIRECTOR GLOBAL SUPPLY CHAIN

 

Difficult parameters

It was decided to use algorithms to determine more accurately when cleaning with water and detergents is required. Arinti was called in to develop the algorithms, fine-tune them through several testing phases and develop the user interface. The Griffith Foods site in Herentals became the home of this pilot project. “It was not an easy job,” Oscar Sluiter continues. “A number of iterations were done to see what works best. We first started from a model based on allergens, odor, taste and color per recipe, among other things. Some of these parameters turned out to be difficult to quantify, so we switched to using the ingredient lists with data from our ERP system as the basis for the algorithms.”

 

Technical solution – Sorting optimization

An algorithm was written according to a logic provided by Griffith Foods. Basically, we matched products based on their similarities. For example: if two products are yellow, contain small flakes and contain fish as an allergen, then they are compatible and can be produced after each other with a dry wash in between. However, if the next product doesn’t contain fish, then we have to schedule a wet wash between the product that contains fish and the product that doesn’t. The algorithm looks at the possible combinations and starts building sequences of products that only need dry washes in between. Of course, these sequences need to be glued together with wet washes. And that’s how we arrive at a daily plan for production on the different machines in the plant.
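
The sketch below illustrates this sequencing idea with invented product attributes and compatibility rules; the actual algorithm works on ingredient lists from the ERP system and is considerably more sophisticated.

```python
# Illustrative sketch of the sequencing idea described above, with
# invented attributes; the real algorithm works on ERP ingredient data.
products = {
    "A": {"color": "yellow", "texture": "flakes", "allergens": {"fish"}},
    "B": {"color": "yellow", "texture": "flakes", "allergens": {"fish"}},
    "C": {"color": "red",    "texture": "powder", "allergens": set()},
    "D": {"color": "red",    "texture": "powder", "allergens": set()},
}

def compatible(p, q):
    """A dry wash suffices only when color, texture and allergens all match."""
    return (p["color"] == q["color"] and p["texture"] == q["texture"]
            and p["allergens"] == q["allergens"])

def plan(order):
    """Greedily chain compatible products; insert wet washes between runs."""
    runs, current = [], [order[0]]
    for name in order[1:]:
        if compatible(products[current[-1]], products[name]):
            current.append(name)          # dry wash is enough
        else:
            runs.append(current)          # incompatible: close the run
            current = [name]
    runs.append(current)
    return " | WET WASH | ".join(" -> ".join(run) for run in runs)

print(plan(["A", "C", "B", "D"]))  # naive order: 3 wet washes
print(plan(["A", "B", "C", "D"]))  # grouped order: only 1 wet wash
```

Comparing the two calls shows the core win: reordering the same four batches cuts the wet washes from three to one. Finding the best order over many products and machines is a harder combinatorial optimization problem than this greedy simplification suggests.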

 

 

Algorithms save time, water and detergents

Choosing to focus on the ingredients turned out to be the right way forward. After training the models and developing the front end, the planners at Griffith Foods received a simple, useful tool that allows them to optimize cleanings. They also use the tool to assess the impact on the rest of the planning when an urgent order has to be squeezed into the normal schedule. “It used to take many hours to calculate all cleanings for a day without compromising the quality of our products,” explains Oscar Sluiter. “Now that happens in a few minutes, it is more accurate and the planner gets a clear suggestion. As a result, we already have five percent fewer wet cleanings, and we want to increase that to at least ten percent. We have already gained back 17 production days per year with the current version of the tool. In addition, this saves us 1,000 m³ of water every year and we use fewer detergents, which is great for the environment. That was also an important parameter for a successful project.”

 

Fluent collaboration leads to international roll-out

Arinti’s references convinced Griffith Foods to work with them. “Arinti did not just dump a lot of technology into our organisation. They listened to our demands and developed a tool that works for us. Our planners find the tool easy to use. In addition, there is an app that helps our employees check which batches are produced on which machines, and that we can use to change dependencies ourselves when databases change location. The fact that the collaboration with Arinti and the tool they developed work so well translates into further roll-out plans for all our sites. Currently, two other factories within Europe are on our schedule for the next test. We’ll use their findings to improve the logic of the algorithms and increase the possible time savings even further. Ultimately, all our 21 locations around the globe will be using this tool,” concludes Sluiter.

Is Microsoft 365 worth the investment?

Is Microsoft 365 worth the investment?

I spent the last 10 years of my career focusing on Microsoft cloud services – at first the Business Productivity Online Suite, a.k.a. BPOS, afterwards Microsoft 365 and Azure. Fascinating technologies, real game changers. And yes, I really adore the Microsoft cloud and strongly believe the best is yet to come. Call me a believer!

It is fair to say that Microsoft makes it very hard, almost impossible, to remain on-premise with their business productivity tools. And that is a good thing … just look at the world we live and work in during the COVID-19 crisis, where digital is the new normal and where businesses that did not move to the cloud often encountered tremendous challenges to collaborate. Microsoft Teams became the standard communication and collaboration tool for lots of organizations in a very short timeframe. Microsoft shared that Microsoft Teams usage exploded by almost 800% in the last month.

Now, what about the budget you will spend as an organization when you use Microsoft 365? Just multiply the subscription price by the number of users in your organization and calculate the total cost for 3 to 5 years. That’s a lot of budget! And there is no way to lower that cost, since pricing is rather non-negotiable and completely user-based. The only thing you can do is maximize the return on your investment.
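
As a back-of-the-envelope illustration of that calculation – the subscription price and headcount below are invented assumptions, not actual Microsoft list prices:

```python
# Back-of-the-envelope licence cost; figures are invented assumptions,
# not actual Microsoft list prices.
price_per_user_month = 30.0   # EUR, assumed subscription price
users = 500
years = 5

total = price_per_user_month * users * 12 * years
print(f"Total over {years} years: EUR {total:,.0f}")  # EUR 900,000
```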

I have 2 recommendations for you …

  1. Deploying and adopting Microsoft 365 is not a technical project. It is a business project! The current Microsoft 365 suite consists of almost 30 different cloud services. You should take time to investigate which cloud services are relevant and valuable for your end users.
  2. Build your network of Microsoft 365 ambassadors. They will be your voice in the company to drive adoption to the maximum. They will increase the return on your investment. They will come up with new use cases for one or more Microsoft 365 cloud services.

Microsoft 365 governance is key. Don’t migrate and consider yourself done. Make Microsoft 365 adoption an infinite process. Define KPIs, hit them, define new ones. Keep looking for new use cases, look at ‘MyAnalytics’ to influence the way your end users get their work done, and ask yourself how you can digitize and automate processes using Power Apps and Power Automate.

And finally, #staysafe. Microsoft 365 offers you enterprise-class security. Traditional physical networks don’t exist anymore: the number of devices has exploded, they are not always managed, and people connect from everywhere. So please leverage the security features Microsoft offers you. My 2 cents? Microsoft 365 is here to stay! And you will need your end users to make adoption a success. Microsoft 365 governance is an infinite process. Ensure you have got the basics right!

danny.otten@nimbuz.be