Digital: Why? For whom?

Although digital transformation is clearly on the agenda of all companies, we must ask ourselves the right questions, especially in smaller organizations, for such a project to succeed (read: ‘generate added value or cost savings’). For what purpose, and for whom, should we launch a digital transformation?

According to consulting firm MarketsandMarkets, the digital transformation market was estimated at 290 billion dollars in 2018 and is expected to reach 665 billion dollars in 2023, which corresponds to a compound annual growth rate (CAGR) of 18% over the period. Consultancy firm Valuates Reports states that this market was valued at 330 billion dollars in 2019 and should reach 785 billion dollars in 2026, a CAGR of 13%.
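As a quick sanity check, these growth rates follow from the standard CAGR formula, (end/start)^(1/years) − 1; here is a minimal sketch in Python, using only the values quoted above:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start value, an end value and a horizon."""
    return (end / start) ** (1 / years) - 1

print(f"{cagr(290, 665, 5):.0%}")  # MarketsandMarkets, 2018-2023: ~18%
print(f"{cagr(330, 785, 7):.0%}")  # Valuates Reports, 2019-2026: ~13%
```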

Although these figures may vary from one consultant to another, it is obvious that digital transformation is a huge market and one of the top priorities, not only of IT directors but also – and especially – of the general management of companies.

The objectives

Globally speaking, digital transformation aims to “rethink the development of IT systems, their availability and their operating models by strengthening the ability of IT to efficiently collaborate within the company and beyond its traditional limits,” says consultancy firm Deloitte.

Therefore, we must deploy a basic infrastructure on which we can rely in order to deliver automated services, cloud computing and new operational business models. Likewise, we seek to accelerate the provision of IT services and to reduce the risks during this deployment.

The final goal is to transform IT in order to make it more reactive (or even proactive), more flexible and more in line with rapidly evolving business needs, without neglecting the inevitable dimension of cost savings.

Consequently, digital transformation implies a strategy aimed at re-evaluating and re-designing all of an organization’s information systems in order to improve efficiency or to offer new innovative services.

In reality, digital transformation mainly focuses on the following targets: customer experience (providing the customer with new services and products in digital form), employee experience (particularly in the field of human resources and access to information), process optimization (with more efficient and personalized workflows) and product digitization (dematerialization by means of an agile architecture).

The steps

A digital transformation project is structured around a clear vision as well as an efficient and coherent strategy. The latter must be defined by the company’s general management, after consulting all business departments and IT. The stakeholders must define their needs and specify their ambitions and objectives, after which a precise implementation schedule is to be established. A clear approach to partnerships with the organization’s various economic players (employees, suppliers, customers) will also have to be defined.

IT supplier Dell EMC has identified three major steps in a digital transformation:

  • modernizing the infrastructure to improve the organization’s efficiency;
  • automating the IT processes and providing service-based IT;
  • transforming the IT processes and operations with a view to more flexibility.

The challenges

Although digital transformation will allow the company to become more reactive, more competitive, more productive and more innovative while reducing its costs, making such a project succeed won’t be easy. The transformation must be steered at the highest level of the organization, within the framework of a clearly defined vision, while involving the different business entities, not just IT. Since this is a transversal project, the company must avoid a silo approach and involve the different entities of the organization by listening to the needs and wishes of the users. After all, transformation implies change: a new approach or structure must be accepted and integrated into the existing one. Moreover, special attention must be paid to (cyber)security, while the return on investment is sometimes difficult to quantify precisely.

In its Digital Radar 2020 report, published this summer, Infosys states: “Too often, I still notice that some CxOs hear a buzzword and immediately want to launch this technology without being clear about how it will help the company to achieve its goals. They underestimate the roles that changes in the field of culture and state of mind will play in the underlying processes.”

The mission of Aprico Consultants is to support companies in their digital transformation projects. This support is both strategic and technological. To underpin this strategy, Aprico has developed an original work methodology based on three axes: smart, lean and agile. One of Aprico’s key success factors is its transversal approach combining business, technology and methodology, together with its demanding quality standards.

Predictions 2021 – Technology trends for 2021

A new year also means wishes, expressed remotely from now on due to the coronavirus. It is also the opportunity to lift a corner of the veil on the major technologies that will mark the upcoming year. Gartner therefore provides a ‘top of the concepts’ that should hold CIOs’ attention.

Traditionally, the end of the year is the time for Gartner to publish its ‘Top strategic predictions’ for the next few years: in short, a preview of what will shape the daily life of IT departments and technology providers. It also brings many new buzzwords that users and service providers will have to become familiar with.

2020 has been a very special year, especially because the Covid-19 pandemic has caused a profound disruption of the businesses and global economies, having forced organizations to consider the future from a different – and necessarily new – perspective. “It became clear that businesses need a reset, not only due to the pandemic, but because technological advances demand it,” says Daryl Plummer, distinguished VP analyst at Gartner. “Technologies are being stretched to their limits. Non-traditional approaches will enable the next rebound of innovation and efficiency.”

Buzzwords…

According to Gartner, three major trends are emerging: people centricity, location independence and resilient delivery.

With ‘people centricity’, Gartner refers to user and customer technologies. The Internet of Behaviors (IoB), for instance, aims to combine existing technologies (RFID, sensors, 5G, edge computing) in order to collect and analyze the behavioral data of people, whether they are office employees, citizens or buyers. Moreover, the total experience aims to combine customer experience, user experience and employee experience in order to make commercial interactions smoother. Finally, strengthening privacy and the protection of personal data is the third axis of this orientation towards the individual (securing the processing and storage of personal data, within the framework of the GDPR in particular).

Furthermore, Gartner thinks that location-independent work will prevail in the future, stimulated in particular by the coronavirus crisis and telework. For this purpose, companies will have to implement a distributed cloud, combining public, private and hybrid cloud, helped by the emergence of edge computing and the Internet of Things (IoT). Also within this framework of location independence, the company will have to provide its employees, customers and business partners with tools for virtual and secure access to data. The overall leitmotiv will be: ‘Digital first, remote first’. Likewise, Gartner refers to the ‘cybersecurity mesh’, which will allow secure access to IT infrastructures by defining the security perimeter around the identity of the person accessing them, no matter where the user and the information are located.

Finally, Gartner groups under the heading ‘resilient delivery’ the ability of an organization to be or become resilient. In this context, Gartner speaks of ‘intelligent composable business’: an approach allowing companies to face potential disruptions in digital technology by combining different operational entities that can anticipate the necessary changes and plan the appropriate reaction in a creative way. Moreover, this resilience must rely on artificial intelligence and governance in order to create high-performance, scalable and reliable models, by integrating AI into the DevOps process, or even into DevSecOps (see our previous blogs). Furthermore, resilience will require hyperautomation (see also a previous blog), abandoning legacy applications and platforms and optimizing the operational processes.

… or real innovation?

Gartner insists on the need to combine these different trends within the framework of a combinatorial innovation, namely “the use of multiple functions, rather than a single technology stack, and the creation of new business capabilities by intelligently and creatively integrating them.”

Therefore, Gartner concludes that IT departments must combine these technologies in order to create a truly innovative approach that will reduce costs and create added value for the business, and thus turn IT into a strategic priority for the organization.

The mission of Aprico Consultants is to support companies in their IT and business transformation projects. This support is strategic as well as technological. To underpin this strategy, Aprico has developed an original work methodology articulated around three axes: smart, lean and agile. Aprico’s transversal approach combining business, technology and methodology, together with its demanding quality standards, is one of its key success factors.

Digital transformation. Will Covid-19 boost digital transformation in the long term?

While the lockdown linked to Covid-19 has forced companies to accelerate telework and the implementation of digital sales channels, most observers see the pandemic as an accelerator of digital transformation.

“I think history will remember this crisis as the moment when the digital transformation of companies and society has suddenly accelerated. Together, we have laid the foundations for the post-Covid world”, IBM’s new CEO Arvind Krishna said during the ‘Think Digital 2020’ event in early May this year.

Consequently, facing the outbreak of the coronavirus pandemic, companies urgently deployed not only telework tools for their employees, but also digital channels to reach their customers and to sell products and services. Jacques Platieau, CEO of IBM Belgium, noticed it too. “During the Covid-19 crisis, it is estimated that 17% of Belgian employees regularly worked remotely, with a majority not doing it more than one day a week. However, the lockdown forced even the most recalcitrant company leaders to adopt telework. They had no choice: business continuity was at stake.” To illustrate his point, the CEO of IBM Belgium also mentioned that on March 15th alone, the Zoom videoconferencing service registered no less than 600,000 new subscribers. Its competitors Skype, Slack, WebEx and Teams were also very popular.

Digital processes

But beyond remote access by means of remote conferencing tools, companies have been forced to accelerate the implementation of digital solutions for accessing and sharing documents and information, not only for their employees but also for their business partners, such as suppliers, customers and prospects. It wasn’t just a matter of deploying new tools, but also of bringing together digital applications that aren’t always interconnected, whether business processes, production systems – ERP in particular – or administrative solutions, including invoicing, payroll and more.

Luckily, many companies had already deployed IT infrastructure in the cloud (private, hybrid or public) and were accessing certain applications remotely, in particular by means of SaaS (Software-as-a-Service), while others had relied on an external IT partner within the framework of an outsourcing contract.

The implementation of such digital tools – often in a hurry, with the problems of urgently made choices, of recalibrating infrastructures and networks that were suddenly in high demand, and of security – made change management particularly tricky, especially given the difficulty of clearly communicating the decisions made. We hope that these efforts will be maintained, or even reinforced, so that companies can become truly digital. “These changes must be accompanied by an internal change in mindset. Of course, the CEO must be aware of this evolution and make it a priority for the organization”, said Jacques Platieau.

It is encouraging that companies have relatively quickly become aware of the cost savings digital technology makes possible, a strong argument during this crisis.

Stabilizing and strengthening the gains

Now that the pandemic is taking hold, organizations will have to base their digital transformation initiatives on a real IT strategy and take the time to make their ongoing projects sustainable. Moreover, the IT teams must be relieved of the enormous pressure of the crisis in order to stabilize and strengthen the infrastructure, the applications and security. Why not rely on a trusted IT partner for this purpose?

“I think we will see an acceleration in digital transformation because many of us want to be ready to face a second crisis of this kind”, Jacques Platieau concluded. He predicts great activity for the digital transformation industry in the next few months and years.

Aprico Consultants is a consulting company specialized in the field of architecture and transformation of information systems. By resolutely accelerating the digital transformation process, we provide our customers with the flexibility, performance and competitiveness required to allow them to strengthen their market position and to improve their customer service.

Seeds today, to ensure the life of tomorrow! Aprico Consultants maintains its environmental policy despite the Covid-19 crisis

The pandemic: a springboard towards an even more responsible, people-centered company

It is clear that Covid-19 has imposed changes in the way we all operate, both in our private and professional lives.

More than ever, our commitment to our company but also to society at large, together with our personal values and those conveyed within the company, are the keys to guaranteeing a sustainable future.

Companies are constantly challenged to demonstrate the creativity and sustainability of their business solutions, but today this also applies to their societal and environmental commitment policies.

At Aprico Consultants, we have been working together in this direction for years, striving to sow seeds along the path of excellence, not only in the services we offer our clients but also in the actions we take to reduce our ecological footprint.

Thanks to the joint commitment of all our stakeholders (management, experts, collaborators, employees, executives, external partners, etc.), the results are there and measurable.

With the active collaboration of our partner CO2 Strategy, we reduced our environmental footprint by 16% between 2018 and 2019 and can now claim ‘zero carbon’ status thanks to our reforestation offsets.


However, we did not stop there, despite the crisis.

Whether you follow us, are one of today’s talents or wish to join our company, discover the latest initiative of Aprico Consultants, a leader in the market of pragmatic ICT consultancy and strategy!

It is a tree nursery! It is located in the Ankarana park in north-western Madagascar, more precisely in the small village of Matsaborimanga. It is part of a vast reforestation program and is managed by our local partners!
This is our commitment: helping our planet to flourish for the generations to come.

In the comments on LinkedIn, let us know your opinion on how important it is for a leading ICT consultancy and strategy company to maintain such a commitment in a sustainable way during this Covid-19 crisis!

What is Application Performance Monitoring?

Networks used to comprise dozens of applications. Today, it’s more likely to be hundreds or thousands. As networks become far broader in nature, it becomes more difficult to track down poorly performing applications. The answer is Application Performance Monitoring – the ability to improve and automate application management.

What is APM?

Application Performance Monitoring is the process by which the performance of applications, including resource usage and the user experience of these applications, is continuously monitored. It’s increasingly critical for high-performance networks today.

Through APM, applications are continuously monitored in terms of both their system-wide performance and the performance users experience. Issues are reported, analyzed, prioritized, and escalated according to their severity, so they can be resolved as efficiently as possible. APM is now critical for a network of any complexity. It can be augmented with advanced solutions such as artificial intelligence and machine learning to identify trends and respond to patterns that would otherwise be missed.

What do APM Metrics measure?

Metrics are a mix of technical metrics and soft scores. An APM might include error rates, latency, request rates, and application availability. Likewise, APM metrics may also include user satisfaction scores. APM is about both whether the application is operating correctly on a technological level and whether it’s operating correctly for users. This is part of what makes it so useful for organizations.
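As an illustration, here is a minimal sketch of how the technical metrics could be derived from a window of request records; the record format, window length and sample values are assumptions for the example, not any particular APM product’s API:

```python
from dataclasses import dataclass
from statistics import quantiles

@dataclass
class Request:
    latency_ms: float
    ok: bool  # False if the request ended in an error

def apm_summary(requests: list[Request], window_s: float) -> dict:
    """Aggregate one monitoring window into classic APM metrics."""
    latencies = sorted(r.latency_ms for r in requests)
    return {
        "request_rate": len(requests) / window_s,          # requests per second
        "error_rate": sum(not r.ok for r in requests) / len(requests),
        "latency_p95_ms": quantiles(latencies, n=20)[-1],  # ~95th percentile
    }

window = [Request(120, True), Request(95, True), Request(480, False), Request(210, True)]
print(apm_summary(window, window_s=60.0))
```

User satisfaction scores would then be layered on top of the same data, for instance via surveys or an Apdex-style calculation.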

Benefits of application performance monitoring

Through application performance monitoring, companies get a better look at how each individual application is functioning. They can pinpoint the applications that consume the most resources or cause the most confusion in the organization, and adjust to the changes the environment needs. Poorly performing applications can be identified, both in terms of technical issues and user experience, and changes can be made to either replace these solutions or fine-tune and optimize them.

Truthfully, Application Performance Monitoring is something that is now non-optional for environments of any complexity. With so many applications being used today, a lack of Application Performance Monitoring can lead to a system that is slow, sluggish, and confusing.

APM best practices

Understandably, a field as complex as Application Performance Monitoring involves a lot of intricacies. But like many other networking concepts, following a collection of best practices can make the process easier.

  • Prioritize applications properly. Some applications are naturally a higher priority than others, and it’s important that any issues with these mission-critical applications be addressed faster. Within mission-critical applications, there are also mission-critical processes (see the sketch after this list).
  • Make sure end-user metrics are tracked correctly. Performance metrics are not debatable, but user experience metrics are. Keep in mind that more complex systems may get higher user score rankings, or user satisfaction metrics may relate to soft processes rather than the technology behind them.
  • Define custom reports. Custom reports and custom metrics are what make any type of performance management and monitoring applicable to the business itself. Every network is different, and every environment needs a custom solution.
  • Continuously review results. Application Performance Monitoring shouldn’t just be used when an issue emerges. Results should be reviewed continuously for trends, patterns, and issues, rather than being pulled out only when a problem is significant enough to cause damage.
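Combining the first and last of these practices, a periodic review job might apply stricter limits to mission-critical applications. The application name and threshold values below are purely hypothetical:

```python
# Hypothetical limits: mission-critical applications get stricter thresholds.
THRESHOLDS = {
    "mission_critical": {"error_rate": 0.001, "latency_p95_ms": 300},
    "standard":         {"error_rate": 0.010, "latency_p95_ms": 1500},
}

def review(app: str, priority: str, summary: dict) -> None:
    """Print every metric in a summary that breaches its configured limit."""
    for metric, limit in THRESHOLDS[priority].items():
        if summary.get(metric, 0.0) > limit:
            print(f"[{priority}] {app}: {metric}={summary[metric]} exceeds {limit}")

review("checkout-service", "mission_critical",
       {"error_rate": 0.004, "latency_p95_ms": 250})  # flags the error rate only
```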

As with other types of network standardization and policy, the goal is to ensure that your monitoring is as consistent as possible. The more consistent it is, the less likely you are to run into errors — and if your processes prove not to be effective enough, the easier it will be to manage change related to them.

The most popular APM tools

There are many popular APM tools available. Dynatrace, Application Insights by Microsoft, Datadog APM, Loupe, and Stackify all lead in terms of market share. For organizations using Google Cloud or Amazon Web Services, the Google Cloud Monitoring and Logging solution provides complete monitoring services – including application management. Within Google Cloud, the APM solution is called ‘Stackdriver’. It’s one of the 90 utilities that Google Cloud Services currently includes.

For any organization, Application Performance Monitoring is now an essential feature. Networks can now become sprawling, and it can be difficult to isolate unintended behavior, network slowdowns, and other issues. The right metrics can make managing and monitoring the system easier, and free up internal networking staff to concentrate on other issues. Wondering whether and why you need APM? Contact us today.

Extended reality (XR). And what if extended reality became… reality?

By combining augmented reality (AR), virtual reality (VR) and mixed reality (MR), extended reality (XR) could become a mainstream immersive technology over the next five years, moving beyond the leisure field to establish itself in industry.

Augmented reality and virtual reality are already a big part of our everyday life. As a reminder: AR adds computer-generated elements to the real world (such as sounds, videos and GPS data), on smartphones or glasses in particular. VR aims for total immersion in a simulated digital environment. Users wear a headset and see an artificial world displayed in 360° vision. It can, for example, transport them into an imaginary, totally virtual setting.

Today, VR technology has already moved beyond the gaming and leisure sector to establish itself in fields such as medicine, construction, engineering, training and the military, with products such as Oculus Rift, Google Daydream and PlayStation VR. MR, finally, makes augmented reality and virtual reality coexist within a ‘hybrid’ reality in order to provide a digital immersion experience during which users interact with digital objects in the real world.

New opportunities

Although these technologies – grouped under the label of ‘extended reality’ – have existed for some time, they have until now been limited to very specific fields. But recently, companies have started to show interest in their potential, to the point that observers now speak of ‘extended reality’ (XR) as a market in its own right. This market is expected to reach $209 billion by 2022, an eightfold increase compared to its current value, according to Visual Capitalist. Gartner – which speaks of an ‘immersive experience’ – considers that by 2022, 70% of companies will have launched pilot projects in immersion, both at professional and consumer level, while 25% of these initiatives will be deployed in production.

Consequently, there are many opportunities for XR technology. In the distribution industry, XR allows customers to try a product before buying it. Training is another major field for XR, especially remote training or training that involves danger (emergency services, surgeons, chemists), where trainees can learn to react to a delicate situation without taking unnecessary risks. The marketing industry is also interested in XR to analyze the reactions of consumers and prospects. The same goes for the real estate industry, where XR will allow future buyers to visit a house virtually, and for construction companies and architects, who will be able to visualize works before their realization.

Obstacles

Although the field of possibilities is nearly endless, especially at a time when processing capacities are constantly increasing and the cloud offers maximum flexibility, the success of XR will nevertheless require removing certain obstacles. First of all, very large volumes of (often personal) data must be collected and processed, and secured as much as possible (taking into account the General Data Protection Regulation or GDPR).

Next, the cost of deploying these technologies will have to be reduced, even if prices are expected to drop as these technologies are massively adopted. The same goes for the prices of the devices. Moreover, connectivity will have to be improved; 5G could be an interesting solution in this context.

Specifically in terms of IT, these new XR technologies will have to be integrated into the existing IT infrastructures in order to fully exploit their potential, which means considerably increasing storage and processing capacities, especially in the cloud. Moreover, analytics and artificial intelligence will make it possible to analyze the large volumes of data to be processed, while the capacity of the networks will have to be adapted accordingly.

Cooperation

Globally speaking, the deployment of XR should reduce costs, improve operational performance and generate value, knowing that one of the major benefits of XR is that it can shorten (or even eliminate) distance. Consequently, in order to exploit the technology optimally, processes must be redesigned with the absence of distance as a starting point. This will force us to think in a completely different way, setting physical barriers aside. More than ever, IT will be a facilitator, but it is up to the business to consider the potential on offer.

Furthermore, the contribution of an external partner could be interesting when considering new opportunities. In this context, the role of the integrator will be decisive in order to select the offer that best meets current and future needs. Aprico Consultants is a consulting company specialized in the architecture and transformation of information systems. By resolutely accelerating the digital transformation process, the company provides its customers with the flexibility, performance and competitiveness required to allow them to strengthen their market position. Aprico Consultants cooperates with its clients to convert their strategy, goals and constraints into pragmatic transformation programs delivering real added value and a proven return on investment.

More information: marketing@aprico-consult.com

Microservices vs API

Ever since the smartphone was first introduced to the world, software development has essentially changed forever. Gone are the days when apps needed to do everything and anything under the sun to stand out in a crowd. Rather than becoming a "Jack of all trades, master of none," apps are now designed to do one (or a few) things incredibly well - thus creating a competitive advantage for themselves in an era where there's "always an app for that."

Because of this trend in the way people are interacting with their software, the entire DevOps process has evolved as well. Regardless of the type of software you’re trying to build or who you’re building it for, two of the most critical concepts to understand today involve microservices and APIs. But what is a microservice and what is an API, and what situations are they commonly used for? The answers to these questions require you to keep a few key things in mind.

What is a Microservice?

To understand what a microservice is, you must first understand what it is acting in contrast to – the monolithic applications of an era that has essentially ended.

Rather than designing one massive application, the microservices architecture structures those same apps as a collection of individual services. In other words, you’re designing an application as a collection of smaller services, each one running its own processes in an essentially independent fashion from the rest. Together, they still make up that larger whole that you would get with a monolithic application – but they’re far easier to build and far more efficient (not to mention faster) to deploy.

As a result of this, the benefits of microservices are as important as they are plentiful. Chief among these is the fact that each microservice is itself highly maintainable and testable, thanks to the fact that they are all also independently deployable. So not only do you not have to wait for the entire application to be “finished” before rolling out critical functionality to end users, but troubleshooting microservices is also dramatically easier because everything is so loosely coupled.

If an issue is identified within one microservice, it can be tested and fixed independently without negatively impacting anything else in the architecture. This allows DevOps professionals to embrace a variety of techniques, like unit testing, which verifies that, given the correct parameters, the microservice in question returns the right information – thus confirming that it is operating correctly. Contract testing is also a big part of this: it tests the communication layer between the microservice and whatever other element the microservice is communicating with at the time.
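As a concrete sketch, here is what a unit test for a small, self-contained microservice function might look like, using Python’s built-in unittest; the pricing service and its business rules are invented for illustration:

```python
import unittest

def compute_total(prices: list[float], vat_rate: float = 0.21) -> float:
    """Core business logic of a hypothetical pricing microservice."""
    if any(p < 0 for p in prices):
        raise ValueError("prices must be non-negative")
    return round(sum(prices) * (1 + vat_rate), 2)

class PricingServiceTest(unittest.TestCase):
    def test_correct_parameters_give_correct_total(self):
        # 10 + 5 = 15, plus 21% VAT = 18.15
        self.assertAlmostEqual(compute_total([10.0, 5.0]), 18.15, places=2)

    def test_invalid_input_is_rejected(self):
        with self.assertRaises(ValueError):
            compute_total([-1.0])

if __name__ == "__main__":
    unittest.main()
```

Because the service is loosely coupled, a test suite like this can run and fail in isolation, without touching the rest of the architecture.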

Additionally, microservices have become so popular thanks to the fact that they’re highly organized around business capabilities. This, coupled with the fact that each microservice is owned by a small team of developers who can devote the appropriate amount of time and attention to it, creates a better experience for both DevOps professionals and the audience they’ve dedicated themselves to serving.

What is an API?

API is an acronym for application programming interface. It’s a term used to describe a set of routines, protocols and even tools that are commonly used when building software applications.

In a general sense, think of an API as a set of instructions regarding how the various components that make up a piece of software should interact with one another. Note that the use of a third party API can also be a viable way to get an application you’re developing to better communicate with one that already exists.

That second point is particularly important as most of the time, APIs are used to allow one computer program to use some level of functionality from another. A general example might be a store-centric app on a smartphone that also allows you to immediately get directions from your location to that store via the ‘Maps’ app on your device. The only way the store’s branded app could do that is via an API, as it needs to ‘communicate’ with that separate app and share relevant information to make sure the directions are accurate.
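A minimal sketch of what such a call could look like from the store app’s side; the endpoint, parameters and response shape are placeholders, not a real maps provider’s API:

```python
import json
import urllib.parse
import urllib.request

def get_directions(origin: str, destination: str) -> dict:
    """Ask a (fictional) maps API for a route between two addresses."""
    query = urllib.parse.urlencode({"from": origin, "to": destination})
    url = f"https://maps.example.com/v1/directions?{query}"  # placeholder endpoint
    with urllib.request.urlopen(url) as response:
        return json.load(response)

# The store app never reimplements routing; it just consumes the API:
# route = get_directions("customer address", "123 Store Street")
```

The point is the division of labor: the store app supplies two addresses and gets a route back, without knowing anything about how the routing itself is done.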

The Difference Between APIs and Microservices

While these two concepts are undoubtedly related, they are also describing totally separate ideas and should always be treated as such.

Remember that an API is essentially a way for a consumer to use some underlying service within an application. A microservice is a much broader idea, describing an actual architectural design methodology that separates portions of an application into those smaller, self-contained services.

So from that perspective, an API would be a way to allow the interaction with either a microservice or a portion of that microservice. It gives the user options regarding what they can do, while the microservice itself carries out the functionality in question.

In Summary

While it is absolutely true that both an API and a microservice involve the structure of a piece of software, they’re describing two totally different things. But the important thing to understand is that they’re not concepts that are in conflict with one another – you wouldn’t choose an API or a microservice during development. Instead, they’re two sides of the same coin, ultimately supporting and empowering one another to create the most important benefit of all – more freedom and flexibility for end users regarding the capabilities that they interact with on a daily basis.


Robotic Process Automation: automation as a guarantee of quality and savings

At a time of cost reduction and a shortage of IT personnel, RPA, or robotic process automation, may appear as a solution for automating repetitive tasks. Plus, it can facilitate digital transformation. So, is it the umpteenth buzzword for the IT industry to enjoy, or truly a disruptive technology? No doubt its success will depend on its integration into the overall business strategy.

If research firm Gartner, which views hyperautomation as one of the ten key technologies for this year, is to be believed, organizations using hyperautomation technologies will be able to reduce their operational costs by around 30% by 2024. Its consultants also estimate that the robotic process automation (RPA) market could represent a turnover of 7 billion euros this year and constitute the most promising market segment in the field of enterprise software – this at a time when 40% of large companies will have adopted an RPA software tool this year, compared to 10% in 2019.

RPA: a buzzword?

Gartner describes hyperautomation or RPA as follows: “This is an approach that allows organizations to quickly identify and automate as many processes as possible. RPA involves the use of a range of technological tools, including but not limited to machine learning, intelligent software solutions and automation tools.”

RPA aims to automate as much as possible those business processes that until now required human intervention. In practice, the organization configures a software application, or robot, capable of capturing and interpreting data in order to process repetitive operations. This brings advantages such as greater speed, improved quality (no risk of human error and 24/7 operation) and obviously also lower operational costs (some observers suggest returns on investment of 30 to 200% in the first year).
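In code terms, such a robot is often little more than predefined business rules applied to each incoming record. A minimal sketch, with an invented invoice-routing rule set and input file:

```python
import csv

def route_invoice(row: dict) -> str:
    """Apply predefined business rules to a single invoice record."""
    amount = float(row["amount"])
    if row["currency"] != "EUR":
        return "manual_review"      # outside the robot's rules: hand over to a human
    if amount < 1000:
        return "auto_approve"
    return "manager_approval"

# Hypothetical input file with columns: invoice_id, amount, currency
with open("invoices.csv", newline="") as f:
    for row in csv.DictReader(f):
        print(row["invoice_id"], "->", route_invoice(row))
```

Note the escape hatch: anything the rules do not cover is routed back to a human, which is exactly where the stability requirements discussed below come from.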

In addition, by integrating technologies such as artificial intelligence, machine learning, natural language processing or voice recognition, RPA is proving to be even more efficient and interesting. And if you also add ITSM (IT service management) or process automation solutions, the benefits will be multiplied even further.

Limitations

Yet RPA is not a panacea, as the technology is not suitable for all types of organizations. Indeed, to allow the generation of relatively stable rules, it requires large volumes of transactions and very repetitive processes. This is particularly the case in the air transport sector (air tickets), in the financial world (credit cards, banking operations), in the public sector (services to citizens) and in many support functions (human resources, helpdesk, purchasing or IT). This even though a firm like Deloitte describes RPA as “a solution that can process any input data by running a series of pre-programmed actions, like a macro, and following predefined business rules.”

Another pitfall of RPA: entrusting (too many?) responsibilities to the robot, to the detriment of human workers. Especially since 9% of the global IT workforce could be threatened by this hyperautomation, according to consultancy Forrester Research.

Also, RPA can only be successful if it is part of an overall strategy. Indeed, it is about rethinking business processes and the entire business model, and therefore about setting up a vast digital transformation program.

Finally, designing, interconnecting and managing thousands of robots may prove to be more complex and costly than most organizations initially imagined. Kind of like the cloud when it started.

Integration

As with many new IT technologies, the watchword of RPA is “integration”, as Gartner explains. “RPA may provide quick relief as a noninvasive form of integration. However, processes are not always simple, routine, repetitive and stable. They may be long running, and they often involve intelligent automated decision making and optimization. The real challenge — to scale beyond the initial few low-hanging fruits of routine processes — cannot be solved by a single tool or with siloed strategies.”

Aprico Consultants is a consulting company specializing in the architecture and transformation of information systems. By resolutely accelerating their digital transformation processes, the company provides its customers with the required flexibility, performance and competitiveness to strengthen their position on the market. Aprico Consultants collaborates with its clients to translate their company’s strategy, objectives and constraints into pragmatic transformation programs that deliver real added value and a proven return on investment.

“Given the potential that RPA holds, it is vital for organizations to develop a digital transformation strategy that encourages an automation-first mindset. While it is difficult to predict the technologies that will emerge in 2020, only innovative organizations will be able to react, adapt and lead,” Gartner concludes.

Edge computing – The link between IoT and the cloud

As the Internet of Things (IoT) gains momentum, a new technology emerges, connecting objects to the hyperscalers’ clouds: edge computing. The objective: shifting computing power as close as possible to the IoT devices.

It looks like we all agree the Internet of Things is the next big revolution. IDC estimates that the number of connected objects will increase from 80 billion in 2025 to more than 500 billion by 2030. The Boston Consulting Group estimates the IoT market at 267 billion dollars this year.

As a matter of fact, the Internet of Things is forcing organizations to rethink their IT architecture. In short, it is a question of deploying connected objects everywhere and ensuring the processing of the data they produce. However, current networks are generally not reliable or fast enough to guarantee the resilience, security and availability IoT workloads need. At the same time, the IT department has to make the whole thing work with limited resources, both in terms of computing power and staff, despite the fact that IoT inevitably comes with gigantic volumes of data.

Why edge?

Basically, edge computing is all about taking part of the IT infrastructure, mainly processing and storage resources, outside the walls of the company and installing it as close as possible to the connected devices. The generated data is sent either to the organization’s data center or to a hyperscale cloud, such as Amazon Web Services (AWS), Microsoft Azure, IBM Cloud, Google Compute Engine or others. In other words, edge computing acts as a high-performance gateway between local processing resources and the cloud, private or public. According to Gartner, 10% of data is already generated and processed outside the data center, a number that will reach 50% by 2022.
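A toy sketch of that gateway role: readings are aggregated locally, and only compact summaries travel to the cloud. The sensor, batch size and uplink below are all stand-ins for illustration:

```python
import random
import statistics
import time

def read_sensor() -> float:
    """Stand-in for a local IoT sensor reading (e.g., a temperature probe)."""
    return 20.0 + random.random()

def send_to_cloud(payload: dict) -> None:
    """Stub for the uplink; in practice, MQTT or HTTPS to AWS, Azure, GCP, etc."""
    print("uplink:", payload)

def edge_gateway(batch_size: int = 10, batches: int = 3) -> None:
    """Aggregate raw readings at the edge; ship only summaries upstream."""
    batch: list[float] = []
    for _ in range(batch_size * batches):  # bounded loop for the example
        batch.append(read_sensor())
        if len(batch) == batch_size:
            send_to_cloud({
                "mean": round(statistics.mean(batch), 2),
                "max": round(max(batch), 2),
                "ts": time.time(),
            })
            batch.clear()

edge_gateway()
```

In this design, only one summary crosses the network per batch of raw readings, which is precisely how the edge relieves both bandwidth and central processing.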

Several drivers justify the shift to edge computing. Sometimes the internal IT infrastructure is too rigid or simply not scalable enough to process the IoT workload in a timely and adequate manner. Likewise, local systems may have too few processing or storage resources. Furthermore, local solutions may be too heterogeneous, deployed over time with little consistency, resulting in limited availability and effectiveness. Finally, the IT staff available locally is often limited as well, and not always able to intervene quickly and efficiently. That is a problem, as IoT generally requires maximum reactivity to guarantee high availability.

Advantages for IT and business

One of the major advantages of edge computing is the reduced need for human resources. IoT technology largely automates the collection and transmission of data. In addition, possible problems are detected and corrected automatically. Thanks to machine learning and automation, the edge works with virtually no human intervention. Likewise, the IT expertise required at local sites is limited, which is an important advantage in the context of the ongoing war for IT talent.

Another major advantage of edge computing is the ease of deploying and managing the infrastructure. Processing, storage, virtualization, backup and disaster recovery functions are integrated within a single platform, avoiding working within silos. Adding new resources can be done without interrupting the service. And given the extensive automation as well as the intuitive edge computing user interface, IT costs can be significantly reduced both in terms of implementation and management. Finally, edge computing offers greater flexibility since it is possible to add (or remove) equipment, depending on the evolution of the activity.

Today, most IT suppliers market a wide variety of edge computing hardware, including edge chips, micro-data centers, microcontrollers, industrial PCs and hyperconverged servers. According to IDC, edge computing could take off in the context of the deployment of 5G mobile networks, as operators integrate micro-data centers – whose space can be rented for edge computing applications – into their 5G antennas.

Ecosystem

It is clear that the deployment of edge computing will lead to a complete overhaul of IT platforms and business practices in the context of digital transformation. Aprico Consultants is a consulting company specializing in the architecture and transformation of information systems. By accelerating the digital transformation process, we provide our customers with the flexibility, performance and competitiveness they need to strengthen their market position and improve their customer service.

SAM: synonym for compliance, efficiency and agility

For a company, the implementation of a software asset management (SAM) solution has many benefits in terms of compliance, efficiency and agility, not to mention lower costs, time saving and risk reduction – all of this with better governance. Too good to be true? Not so sure…

A software asset management (SAM) solution aims to manage and optimize the purchase, deployment, maintenance and use of software within an organization, as well as the disposal of unused software or software purchased without the approval of the IT department. The first implementation of a SAM program may seem insurmountable for a company, and a large-scale deployment is indeed a major challenge. Nevertheless, the consulting company Deloitte estimates that implementing SAM can reduce a company’s annual software expenditure by 25%, from the first year onwards. Given that, according to Gartner, expenditure on software licenses and maintenance amounts to as much as 22% of a company’s IT budget, it is probably not superfluous to acquire such a SAM solution at a time when budgetary savings are more than ever on the agenda.

However, a study performed in 2018 by IDG Research among CIOs indicates that 72% of organizations don’t have a real SAM strategy, that 74% of them haven’t implemented a formal SAM structure and that 83% don’t consider SAM a strategic initiative.

Multiple benefits

One of the most obvious benefits of SAM is financial, of course. Indeed, SAM makes it possible to identify obsolete software, which can then be replaced by more modern or more efficient programs, updated – if necessary – to guarantee optimal productivity, or simply retired. In other words, the overconsumption of software must be avoided: an estimated 36% of software is underused or not used at all. Sometimes the number of users of a given piece of software, and thus the number of licenses to register, is underestimated, leading to financial risks. Similarly, the use of the cloud and shadow IT favors the emergence of unlisted solutions whose use isn’t entirely controlled by the IT department. (Shadow IT is the phenomenon whereby applications escape the visibility, control or management of the IT department because they have been purchased on an employee’s or a department’s own initiative, without prior consultation of the IT department.)
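As a sketch of the kind of check a SAM tool automates, consider a license-utilization scan over an inventory; the product names, figures and 50% cut-off are invented for illustration:

```python
# Hypothetical inventory: licenses owned vs. users actually active.
inventory = [
    {"software": "CAD Suite",   "licenses": 100, "active_users": 41},
    {"software": "BI Platform", "licenses": 50,  "active_users": 49},
    {"software": "Legacy ERP",  "licenses": 200, "active_users": 12},
]

for item in inventory:
    utilization = item["active_users"] / item["licenses"]
    if utilization < 0.5:  # assumed cut-off for "underused"
        print(f"{item['software']}: {utilization:.0%} used -> "
              "candidate for reclamation or retirement")
```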

Another major benefit of SAM: time savings for the IT department. Indeed, the software landscape can be established more easily and more quickly (number of licenses, location in the infrastructure) and subsequently managed more proactively, while adaptations and modifications will be easier to perform.

Moreover, simplified management implies risk limitation. Indeed, a controlled software environment will reduce the risk of attacks (the software is known, updated and patched correctly), while software not authorized by the IT department should disappear.

In terms of security, it is obvious that the use of the cloud in particular or the use of unauthorized licenses significantly increases the risk of (cyber)attacks. Similarly, good governance requires the exhaustive mapping of all programs as well as the most transparent management of the software assets.

Finally, decision-making will improve: a precise, centralized inventory of resources makes it possible to (better) understand how software and data are used, and thus to optimize their use.

Involving the business

Apart from the benefits for the IT department, SAM will also be interesting for the legal department, which gains more precise visibility on the software fleet and will thus be able to respond to any audit. The purchasing and financial departments will be able to better control software purchases and to better negotiate contracts with software publishers. The human resources department will be able to ensure user governance with regard to the directives and regulations of the company, as well as the optimization of software resources in relation to the employees.

To put it briefly: if the IT department triggers a SAM project, all departments and business users must be involved, under the supervision of the general management. To do so, they should rely on a trusted partner capable of assisting the organization with the preparation, deployment and follow-up of software asset management. Preferably, this partner should understand the IT structure of the company and propose a data collection and analysis methodology, before suggesting improvements or adaptations to optimize software management and deliver a return on investment for the project.

Aprico Consultants is a leading consultancy company that guides the strategy and transformation to boost the performance, productivity and competitiveness of your organization. We combine cutting-edge expertise with a perfect understanding of the customer’s context and experience, as well as an end-to-end approach in all sectors, from consultancy to the deployment of solutions. Therefore, we can offer tailor-made and pragmatic solutions oriented towards efficiency and profitability.

More information: marketing@aprico-consult.com

Career opportunities

We’re always looking for talented people.
Are you one of those?