Cybersecurity isn’t merely the concern of your IT department: it is everyone’s business. Not only does it concern all levels of your decision-making, up to the highest level, it also concerns each and every one of your employees. Not to mention your external partners, since your company is increasingly opening up to the outside world.
In a recent study entitled ‘Risk Value 2018’, NTT Security assessed the cybersecurity policies of 1,800 companies. According to that global survey, nearly one third of Benelux companies are not well prepared for a cyber attack. Moreover, they often fail to advance their prevention and preparedness policies. In fact, only 45% of Benelux respondents say they have established an IT security policy, 12 percentage points below the international average. And those same companies spend only 12% of their ICT budget on cybersecurity. And there’s more: 34% of companies in the Benelux say they would be prepared to pay a ransom if they fell victim to a cyber attack, such as a ransomware infection. In addition, the survey shows that the distribution sector is the least prepared for a cyber attack, followed by the transportation sector, the wholesale trade and the services industry. The telecoms, pharma, chemistry and technology industries are better protected.
Leadership required
At the global level, the study holds some other surprises. To begin with, only 19% of companies regard the Chief Information Security Officer (CISO) as the person who is ultimately responsible for IT security, whereas 22% assign this responsibility to the CIO and 20% to the CEO. This shows a great dilution of responsibilities and skills in IT security. And the new General Data Protection Regulation (GDPR) does not help matters, since a Data Protection Officer (DPO) has now also been added to the list of functions. And here’s another disturbing survey result: barely 57% of organizations have a well-established security policy, while 26% are still working on it. Finally, only 39% of managers believe that their employees fully understand the security measures defined by their company.
Within your company, clear and effective leadership must therefore be established, especially since digital transformation requires a solid and secure foundation. It will be up to your CISO to define your IT security policy and to raise awareness of it among all your stakeholders: management, staff, business partners, etc. All the more so since the rise of the Internet of Things (IoT), the cloud, social networks and mobile devices is creating new attack vectors. It is imperative that your security policies are embedded in your daily business and that data-centric incident management solutions are deployed throughout the life cycle of your data.
Cloud to the rescue
In addition, the cloud requires a clear view of the movement of your data, wherever it is, as well as measures to protect that data and specific procedures for incident management. At the same time, the cloud is driving the emergence of managed security solutions that exploit the potential of artificial intelligence and machine learning to identify threats as quickly as possible and counter them with maximum effectiveness. This could well be an interesting answer to the glaring shortage of specialized security professionals.
Moreover, the cloud offers immense potential in terms of computational power and flexibility, enabling highly sophisticated algorithms that are capable of analyzing threats in real time, modelling risks and providing a quick response in case of attack. Similarly, the cloud can enable closer collaboration between security actors by sharing threat information (especially details on attack life cycles) and information on cyber criminals (including their most commonly used tactics, techniques and procedures). This is notably the mission of the Cyber Threat Alliance (CTA) and its Adversary PlayBooks project.
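As a deliberately simplified illustration of the idea (real managed security platforms use far richer models than this), threat detection often starts from a statistical baseline: flag any observation that deviates too far from historical behaviour. The data and threshold below are hypothetical, purely to show the principle.

```python
# Toy anomaly detector: flag a value that deviates more than `threshold`
# standard deviations from a historical baseline (a simple z-score test).
from statistics import mean, stdev

def is_anomalous(history, latest, threshold=3.0):
    """Return True if `latest` is a statistical outlier versus `history`."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu  # flat baseline: any change is an anomaly
    return abs(latest - mu) / sigma > threshold

# Hypothetical hourly failed-login counts for one account
baseline = [4, 6, 5, 7, 5, 6, 4, 5]
print(is_anomalous(baseline, 60))  # a spike like this is flagged: True
print(is_anomalous(baseline, 6))   # normal variation is not: False
```

Production systems replace the z-score with trained models and feed in many more signals, but the workflow is the same: learn normal behaviour, then alert on deviations in real time.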
Collaboration: the way to go
More than ever, collaboration seems the right way to fight cybercrime effectively. Collaboration not only between departments within your company (transparency is without a doubt the best approach in case of attack), but also collaboration with your technology partners.
As an IT specialist, Aprico Consultants helps organizations establish their ICT strategy and assists them in their digital transformation, in order to improve the performance, productivity and competitiveness of their business. We combine in-depth knowledge of various aspects of ICT with technology expertise and an end-to-end understanding of our clients' business processes.
Given that the cybersecurity market is particularly fragmented (even though a certain consolidation of the ecosystem is under way), the choice of a reliable and trusted technology partner is essential in your search for a sustainable and global ICT platform. In other words: a partner who is capable of not only selecting the most relevant offer, but also of deploying and maintaining it.
Aprico aims to help companies innovate and rethink their business processes by putting security at the centre of their strategic thinking. We share best practices, technologies and organizational models that allow your organization to open up to the outside world and to share information securely.
While the cloud clearly remains one of the cornerstones of your company's IT flexibility, a new style of architecture is now increasingly needed: microservices. It promises greater ease of development and deployment, better scalability, easier maintenance and more flexibility in the way you engage with technology.
Your company has always faced many IT challenges: ensuring the availability and performance of your applications, upgrading them quickly and inexpensively, and developing and deploying new solutions faster.
Advantages
In short, if it wants to keep up, your organization must be synonymous with responsiveness and flexibility in order to fully meet the demands of your business. Yet all too often your IT department inhibits this dynamic, because of your installed base (i.e. your historical applications), the rigidity of your infrastructure and the lack of flexibility of your development teams. Of course the cloud - and especially the hybrid multi-cloud (see our previous blog) - can provide an answer to these challenges. But nowadays, it’s possible to go beyond that answer, thanks to microservices.
Schematically, a microservices-based architecture aims to develop an application as a suite of small services, each running in its own process and communicating through lightweight mechanisms. These services are built around business functionality and can be deployed independently as part of an automated process, while centralized management is kept to a minimum.
In other words, a microservices architecture offers you several advantages: shorter development cycles, scalability built in from the start of your development, the possibility of deployment on-premise or in the cloud, the ability to handle complex requirements, less vendor lock-in thanks to the many products available in open source, the possibility to choose the best implementation technology to solve your specific problem, as well as ease of maintenance and upgrade.
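To make the definition concrete, here is a minimal sketch of one such service in Python (an arbitrary choice, since each service can use the technology that fits it best). The "inventory" domain, the routes and the data are invented for this illustration; the point is the shape: a small service owning its own data, exposed over a lightweight mechanism (plain HTTP and JSON), with no shared database.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy in-memory store owned by this service alone (hypothetical data)
STOCK = {"sku-1": 12, "sku-2": 0}

def lookup(sku):
    """Pure business logic, kept separate from the transport layer."""
    if sku in STOCK:
        return 200, {"sku": sku, "in_stock": STOCK[sku]}
    return 404, {"error": "unknown sku"}

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Lightweight mechanism: HTTP + JSON, independently deployable
        status, payload = lookup(self.path.strip("/"))
        body = json.dumps(payload).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# HTTPServer(("localhost", 8080), InventoryHandler).serve_forever()  # start the service
```

Each such service is built, tested and deployed on its own pipeline; the application emerges from the composition of many of them.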
Constraints
While the microservices approach has many advantages, adopting it requires that you meet a number of conditions. For one thing, as a complex technology and architecture, it requires certain specialized skills. In addition, a DevOps and automation culture is absolutely essential. Finally, the boundaries of each service must be clearly defined, while organizational changes are also needed to ensure the success of such a project.
In short, you definitely shouldn’t look at microservices as the new Eldorado for your IT department: this type of architecture is only suitable if your company needs to handle scale, complexity and speed of implementation. And you certainly shouldn’t underestimate the organizational challenges it brings. It effectively requires you to set up cross-functional teams, multidisciplinary and autonomous, with clear boundaries between them (and therefore between your microservices). Ideally, those teams have a dual dimension: a vertical, business-oriented axis and a horizontal, technological one that connects teams, builds the knowledge network and establishes governance of the boundaries between services.
DDD approach
To meet these challenges, we at Aprico propose a type of structure, based on the idea of DDD or Domain-Driven Design. It is neither a framework nor a methodology, but rather an approach (as described in the book of the same name by Eric Evans) that aims to define a common vision and language shared by all people involved in the development of an application.
In practice, DDD offers tools capable of establishing the service boundaries, the upper boundary being the bounded context and the lower boundary the aggregate. Domain events are powerful building blocks for service orchestration, while domains constitute natural boundaries for the business-oriented team. In addition, context integration and team relationships are governed by strategic design.
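As an illustration only (the class and event names are invented for this sketch, not taken from Evans’ book), an aggregate that enforces its own invariants and records domain events for other services to react to could be modelled like this in Python:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass(frozen=True)
class OrderShipped:
    """Domain event: an immutable fact other services can subscribe to."""
    order_id: str

@dataclass
class Order:
    """Aggregate root: the lower service boundary in DDD terms."""
    order_id: str
    shipped: bool = False
    events: List[object] = field(default_factory=list)

    def ship(self):
        # The invariant lives inside the aggregate, not in its callers
        if self.shipped:
            raise ValueError("order already shipped")
        self.shipped = True
        self.events.append(OrderShipped(self.order_id))
```

The recorded events are what a microservices architecture then publishes across bounded contexts, which is why domain events work so well as orchestration building blocks.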
Partnership
Microservices represent the logical evolution of a distributed systems architecture. They are intended to meet your needs for complexity, scalability and speed of delivery. But to succeed, your organization must be aligned. DDD offers powerful tools for structuring a microservices architecture through aggregates, bounded contexts, domain events and the strategic design that governs the relationships between contexts and teams.
As a consulting firm specializing in information systems architecture and transformation, Aprico Consultants allows you to resolutely accelerate your digital transformation processes. We provide you with the flexibility, performance and competitiveness needed to strengthen your position in the market. Aprico Consultants works with you to translate your company's strategy, objectives and constraints into pragmatic transformation programs that deliver real added value and proven return on investment.
While the cloud has caused much ink to flow, it seems that most organizations prefer the hybrid cloud, combining private and public cloud to enjoy the best of both worlds. Transparency and portability remain the big challenges. Hence the need to call on the services of a professional provider to make the right choice.
According to a study by IDC, companies will spend $554 billion on cloud computing and related services by 2022, twice as much as in 2016. A survey by IDG among 550 IT decision-makers revealed that 73% of organizations have already adopted the cloud, while 17% plan to do so in the next 12 months and only 10% have a migration planned within the next 3 years. On-premise data and applications are not becoming obsolete, but implementation models are evolving into architectures that combine public and private clouds, as well as on-premise data and SaaS-based applications.
In addition, a study by Computer Profile revealed that in recent years, the adoption of cloud solutions has accelerated considerably. In 2010, only 13% of Belgian companies used one or more cloud solutions. By 2018, that number had risen to 64%: five times as many as eight years earlier. The survey also revealed that the cloud is most popular among SMEs.
The cloud, yes but...
Among the reasons that justify cloud investments, companies mention the need to accelerate IT service provisioning, the need to have more flexibility in responding to changing market conditions, the need to have guaranteed business continuity, the reduction of development time, the improvement of service and customer support, and the reduction of Total Cost of Ownership (TCO).
On the other hand, major cloud adoption obstacles include vendor lock-in (47%), security risks (34%), the location of the data storage (34%), the shortage of skills to manage the cloud and maximize the investment (31%), as well as difficulties related to integration (29%).
Hybrid is the answer
Facing this reality, more and more companies are choosing hybrid cloud, combining the benefits of public cloud and on-premise private cloud. But this approach requires rigorous management and advanced technical skills, especially in terms of security, as well as transparency when switching from one environment to the other. Therefore, the most important choice is that of the technology provider.
According to the IDG study, companies manage an average of 4 Software as a Service (SaaS) providers, 3 Platform as a Service (PaaS) partners and 2 Infrastructure as a Service (IaaS) vendors. At the same time, they are considering increasing the number of their partners in the next 18 months (to 5 in SaaS, 3 in IaaS and 3 in PaaS).
Conclusion of the study: “Organizations are no longer wondering whether they should turn to the cloud. They are now looking to optimize the use of new cloud services by adopting new delivery models and implementing multi-cloud architectures.”
Challenges
While the adoption of the cloud was initially justified by cost savings, it now seems that the cloud is needed at the express request of the IT department to improve the availability of IT services, speed up development and deployment, expand the IT services portfolio, increase flexibility, and offer better disaster recovery capabilities. This evolution increases the need for integration between cloud provider platforms, even if the arrival of Kubernetes-based managed service offerings should facilitate compatibility between solutions. In addition, the maturity and depth of the offers, as well as the pricing of the products, will be part of the selection criteria, as will customer experience.
Proving the success of hybrid multi-cloud, IDC announced that for the first time, sales of traditional IT infrastructure were exceeded by sales of cloud infrastructure in the 3rd quarter of 2018. Sales of cloud infrastructure are expected to keep growing at a cumulative rate of 13.3% through 2020, reaching 57.6% of all infrastructure sales.
In this context, companies and their IT departments will benefit from relying on their IT partners to select the offer that best meets their needs - current and future. Aprico Consultants is a consulting firm specializing in architecture and information systems transformation. By accelerating the digital transformation process, Aprico provides its customers with the flexibility, performance, and competitiveness they need to strengthen their position in the market. Aprico collaborates closely with clients to translate their strategy, objectives, and constraints into pragmatic transformation programs that deliver real added value and proven return on investment.
These days, information must be available and accessible everywhere, at any time and by any partner of the company, including employees, customers, suppliers, prospects, and more. This requires an infrastructure that is powerful, reliable, and secure, as well as digital processes. Welcome to the virtual IT environment.
More than ever, the enterprise is moving beyond its internal borders to open up to all its stakeholders. No longer working in isolation, the organization must make its platforms and data accessible to drive growth.
Here are some numbers that explain the extent of the phenomenon. According to a study by StrategyAnalytics, the number of mobile employees will increase from 1.45 billion in 2016 to 1.87 billion in 2022 worldwide – or 42.5% of the active population. Accenture estimates that the annual number of security breaches increased by 27.4% in 2017, showing the fundamental importance of finding the right balance between unhindered mobility and information security.
According to PwC, 50% of all employees will be from Generation Y by 2020. This generation grew up using computers and mobile technology, and it is clear that companies will have to adapt to its needs. If not, they will fail to attract the talent they need and risk seeing employees leave, as 44% of employees today – according to a Dell and Intel survey – believe their professional environment is not smart and connected enough.
Internal organization
The old architecture silos must give way to virtual environments, relying heavily on the cloud - private, hybrid or public - to facilitate the exchange of information. Similarly, the workstation becomes more virtual, especially through Virtual Desktop Infrastructure (VDI) technology, making the user more mobile and stimulating collaborative work. The underlying IT infrastructure also becomes virtual to provide greater flexibility and enable change. The rise of edge computing improves the productivity of both remote and on-site employees, while relieving pressure on the internal IT infrastructure and enhancing the security level of the extended IT environment.
Road congestion and workspace costs stimulate teleworking and videoconferencing, while agile collaboration methods and collaboration in virtual teams become part of everyday business. This is why network operator Proximus estimates that by 2020, 90% of organizations will deploy telework applications for their employees. At the same time, the use of apps is on the rise, and not only to improve productivity, while storage evolves into shared storage in the cloud.
Apart from mobile devices and software, IP telephony and instant communication are indispensable collaboration tools. Not to mention fixed-mobile convergence and unified communications that allow employees to work wherever they are.
The network
Thanks to the cloud, the organization is able to focus on the added value of IT and on business innovation, leaving the management of its infrastructure in the hands of trusted partners. In addition, the cloud will integrate both applications and data in a more open and flexible global system, provided the necessary security is in place. The cloud will also enable better activity-based provisioning of IT resources and turn capex into opex, a particularly relevant argument in times of budget cuts.
Networks – LAN or WAN – are essential links in data transmission. The focus won’t only be on WiFi wireless networks, but also on 5G technology, which will increase transfer speeds and allow ever larger volumes, following the growth of unstructured data, images and video in particular. 5G will also facilitate the breakthrough of the IoT (Internet of Things) and M2M (Machine-to-Machine) communication. The technology is expected to arrive on the market in 2019 and will need to be quickly standardized and tested. At the same time, devices will need to be ready as well, allowing large-scale deployment, especially in the professional market.
The outside world
By deploying digital processes, the organization opens up to its partners. Customers enjoy a richer user experience: they find information faster, benefit from more personalized data, take advantage of automated processes (including the order process), and use different communication channels. As a result, the customer enjoys a richer offer of customized services.
Suppliers benefit from digitization and mobility to access a wider range of internal data and automate their logistics processes, which will improve operational excellence, reduce costs, and increase the levels of productivity, innovation, and creativity.
In the field
In any mobility project, it goes without saying that security is an important, if not crucial, dimension. Likewise, change management as well as user information and training need attention to ensure the widest possible adoption. And why not clean house before installing new mobile solutions?
To help companies succeed in their mobile transformation, Aprico respects and implements three basic rules: smart, lean, and agile. The smart approach aims at understanding the real added value of mobile technology for the organization and to set the business priorities to achieve the transformation. Aprico is open and independent. Its recommendations are based on the platforms and technologies that fit the client best. Aprico strongly believes in the lean improvement culture that involves planning the right resources at the right time and at the right cost. Finally, our agile methodology combines ongoing collaboration (involving stakeholders as early as possible) throughout the development process, rapid prototyping, and frequent testing (splitting the project into manageable entities resulting in regular delivery). Our approach reduces time-to-market, ensures easier change and maintenance, and ultimately improves customer satisfaction.
More than ever, the enterprise is moving beyond its internal borders to open up to all its stakeholders. No longer working in isolation, the organization must make its platforms and data accessible to drive growth.
Here are some numbers that explain the extent of the phenomenon. According to a study by StrategyAnalytics, the number of mobile employees will increase from 1.45 billion in 2016 to 1.87 billion in 2022 worldwide – or 42.5% of the active population. Accenture estimates that the annual number of security breaches increased by 27.4% in 2017, showing the fundamental importance of finding the right balance between unhindered mobility and information security.
According to PwC, 50% of all employees will be from Generation Y by 2020. This generation grew up using computers and mobile technology. It is clear that companies will have to adapt to the needs of Generation Y. If not, they fail to attract the talent they need and will risk to see employees leave, as 44% of employees today – according to a Dell and Intel survey – believe their professional environment is not smart enough and not connected.
Internal organization
The old architecture silos must give way to virtual environments, relying heavily on the cloud – private, hybrid or public – to facilitate the exchange of information. Similarly, the workstation becomes more virtual, especially using Virtual Desktop Infrastructure (VDI) technology, making the user more mobile and stimulating collaborative work. The underlying IT infrastructure also becomes virtual to provide greater flexibility and enable change. The rise of edge computing improves employee productivity – of both remote and on-site employees – as well as relieve pressure on the internal IT infrastructure and enhance the security level of the extended IT environment.
Road congestion and workspace costs stimulate teleworking and videoconferencing, while agile collaboration methods and collaboration in virtual teams become part of everyday business. This is why network operator Proximus estimates that by 2020, 90% of organizations will deploy telework applications for their employees. At the same time, there is a rise in the use of apps, and not only to improve productivity, whereas storage evolves into shared storage in the cloud.
Apart from mobile devices and software, IP telephony and instant communication are indispensable collaboration tools. Not to mention fixed-mobile convergence and unified communications that allow employees to work wherever they are.
The network
Thanks to the cloud, organizations are able to focus on the added value of IT and on business innovation, leaving the management of their infrastructure in the hands of trusted partners. In addition, the cloud will integrate both applications and data in a more open and flexible global system, provided the necessary security is in place. The cloud will also enable better activity-based IT resources provisioning and turn capex into opex, a particularly relevant argument in times of budget cuts.
Networks – LAN or WAN – are essential links in data transmission. The focus won’t only be on WiFi wireless networks, but also on 5G technology, which will increase transfer speeds and allow ever larger volumes, following the growth of unstructured data, images and video in particular. 5G will also facilitate the breakthrough of IoT (Internet of Things) and M2M (Machine-to-Machine). The technology is expected to arrive on the market in 2019 and will need to be quickly standardized and tested. At the same time, devices will need to be ready as well, allowing large-scale deployment, especially in the professional market.
The outside world
By deploying digital processes, the organization opens up to its partners. As a result, customers enjoy a richer user experience. They find information faster, benefit from more personalized data, take advantage of automated processes (including the order process), and use different communication channels. In short, the customer enjoys a richer offer of customized services.
Suppliers benefit from digitization and mobility to access a wider range of internal data and automate their logistics processes, which will improve operational excellence, reduce costs, and increase the levels of productivity, innovation, and creativity.
In the field
In any mobility project, it goes without saying that security is an important, if not crucial, dimension. Likewise, change management as well as user information and training need attention to ensure the widest possible adoption. And why not clean house before installing new mobile solutions?
To help companies succeed in their mobile transformation, Aprico respects and implements three basic rules: smart, lean, and agile. The smart approach aims at understanding the real added value of mobile technology for the organization and at setting the business priorities to achieve the transformation. Aprico is open and independent. Its recommendations are based on the platforms and technologies that fit the client best. Aprico strongly believes in the lean improvement culture that involves planning the right resources at the right time and at the right cost. Finally, our agile methodology combines ongoing collaboration (involving stakeholders as early as possible) throughout the development process, rapid prototyping, and frequent testing (splitting the project into manageable entities resulting in regular delivery). Our approach reduces time-to-market, ensures easier change and maintenance, and ultimately improves customer satisfaction.
Are these diametrically opposed architectures incompatible?
Quite some time ago, software vendors promoted a centralized hub-and-spoke middleware product for Enterprise Application Integration: the EAI broker (A).
Then SOA came along, and the tool of choice became the ESB.
But in many cases the ‘B’ stood for ‘Broker’ instead of ‘Bus’, as most vendors just face-lifted their EAI products. It took a while longer before a new kind of decentralized version of the ESB (B) appeared, but the damage was done.
A more decentralised approach
These days, vendors avoid the ESB label. In the current market, everything has to be distributed and flexible, while the ESB is seen as a heavy, monolithic and inflexible backbone.
These last years, the offering moved towards a completely distributed approach, where applications are seen as a collection of fine-grained, autonomous services, communicating through lightweight protocols and offering more flexibility and scalability: microservices (C).
The flexibility of microservices comes with an even bigger challenge regarding service-to-service communication.
As the number of microservices grows, the death star anti-pattern lurks around. Service meshes were developed to address a subset of the inter-service communication problems in a totally distributed manner.
Distribution can be challenging
But wait, how do we expose all those microservices’ public interfaces to the outside world? Through an API gateway. Centralized!
And how can we address the API consumers’ variety? Through BFF: Back-end For Front-end. Decentralized! Taking that perspective, it seems that not everything is meant to be distributed.
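The Back-end For Front-end pattern described above can be illustrated with a minimal sketch. The service names and fields below are invented for the example; in a real system each function would be a remote call to a separate microservice, and the BFF would live close to one specific client type.

```python
# Hypothetical illustration of the BFF pattern: two fine-grained
# microservices and a Back-end For Front-end that aggregates them
# for a mobile client. All names and fields are invented.

def catalog_service(product_id: int) -> dict:
    # Stand-in for a remote microservice call
    return {"id": product_id, "name": "Widget",
            "description": "A long marketing text not needed on mobile"}

def pricing_service(product_id: int) -> dict:
    # Stand-in for a second remote microservice call
    return {"id": product_id, "price": 9.99, "currency": "EUR"}

def mobile_bff(product_id: int) -> dict:
    """Aggregate several services and trim the payload for a small screen."""
    product = catalog_service(product_id)
    price = pricing_service(product_id)
    return {
        "name": product["name"],  # the mobile UI shows name and price only
        "price": f'{price["price"]} {price["currency"]}',
    }

print(mobile_bff(42))  # {'name': 'Widget', 'price': '9.99 EUR'}
```

A second BFF – for a partner portal, say – would aggregate the same services differently, which is exactly why this layer is decentralized per front-end while the API gateway in front of it stays centralized.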
Due to the explosion in mobile device adoption, the rise of Internet business models, and SOAgility, companies have the opportunity to develop new customer and partner channels – and embrace the potential revenue growth that comes with them – through the API economy. Customer-facing and partner-facing engagement systems are the first to be impacted by the need for a dynamic integration style – and therefore the first area where it makes sense to start.
At the same time, organisations that have invested significantly in EAI or ESB technologies are unlikely to replace those technologies with microservices in the blink of an eye. These tools will continue to play a critical role in enabling organisations to integrate their (legacy) systems (such as ERP and mainframe) and applications effectively for quite some time to come.
These technologies are still well suited to manage the integration of a few closely-aligned teams working with a predictable technology stack. In this sense, they are effective application integration platforms rather than something that can work across a wide and diverse enterprise.
In conclusion, if a one-style-fits-all architecture is not yet conceivable, both architecture types will inevitably coexist. But then again, the microservice architecture approach offers a good opportunity to try out new business models and at the same time assess the existing ones.
The new landscape encourages reflection. Which parts of the environment do you still want to or have to operate centrally? Mission-critical core services are changing rather infrequently, indeed. But still, why wouldn’t you want to improve their actual set-up?
And of course, your IT environment has more than mission-critical core services alone. Where does it make sense to disrupt the existing models, put a fail-fast strategy in place and create so-called edge services?
Questions? Don’t hesitate to get in touch. Aprico develops and implements innovative ICT solutions that improve productivity, efficiency, and profitability, applying cutting-edge expertise and perfectly mastering both the technological and business aspects of clients in all industries. Aprico enables clients to be more efficient and to take up technological challenges, including SOA and MSA.
Making technology work for the business – and not the other way around
The emergence of DevOps, self-service and digital: these are the three major developments that drive the IT Service Management (ITSM) market in 2018, according to Forrester. ITSM aims at integrating processes, people and technology, in order to support the business objectives of the organization.
IT Service Management aims at providing IT services in a more cost-effective and result-oriented way, while responding more quickly to changing needs and offering major benefits in terms of efficiency, agility, availability, security, and compliance.
DevOps
“DevOps leads to innovation in several key areas of IT Service Management, including change management and release management”, says Doug Tedder, senior consultant at Tedder Consulting. Charles Betz of Forrester Research shares this view. He adds that business adoption of DevOps clears the road for more automation in change management, while development teams broaden their roles and share risks with the whole development chain, and even with operations. In addition, DevOps helps reduce the red tape linked to the deployment of changes, especially since these changes are more limited and more in line with the reality of the project.
“Because teams know that changes will be minimal and a correction can be implemented quickly, they are more willing to take risks, while the infrastructure becomes more resilient”, Betz added. Beyond DevOps, a robust and efficient infrastructure, as well as experienced teams in both development and management are essential for the implementation of an effective IT Service Management strategy. “In recent years, DevOps projects have mostly focused on development teams”, says Stephen Elliot, VP at IDC. “But in the coming years, infrastructure teams will have to be more involved, because DevOps allows improvements in terms of speed and quality. Benefits include rapid deployment of projects at business level and higher quality in terms of user experience, satisfaction, and renewal rate.”
Self-service
For Forrester, self-service is the other major trend in the IT Service Management market. The evolution is confirmed by ITSM solution vendor BMC Software, which estimates that 83% of IT organizations have deployed or support self-service tools for IT users. It’s an approach that not only allows users to better control their own ‘experience’, but also to share their feedback and find solutions to their problems.
“Every ITSM vendor aims for an early warning system, allowing customers to be aware of issues before they impact the user experience”, Elliot added. “To do this, suppliers develop analytical and automation tools as well as intelligent warning systems to achieve the highest level of self-service possible.” But even then, self-service should not be regarded as a technology, but as a tool, the success of which depends largely, if not totally, on its acceptance by the end-user.
Digital transformation
Digital transformation is the third major driver in the ITSM market. Digital transformation embodies a transformation that, at IT level, is not limited to the objective of aligning technology with the business. Digital transformation aims at integrating technology into the business, allowing IT to become an integral part of the operations, as opposed to being nothing more than a service provider.
“Digital transformation is not about technology”, says Stephen Mann, principal analyst and content director at ITSM.tools. “It’s a matter of business improvement. It is also as much a transformation of people as it is of technology. And we should not only look at new products, services or a more direct engagement with customers, but also at more efficient back-office operations.”
From IT Service Management to ESM
To make the whole organization digital, the trend is now to roll out service management in other entities of the company – hence the term ESM or Enterprise Service Management. ESM includes support services such as human resources, quality, and facilities. In doing so, each employee can use (digital) technology to access different services in self-service mode. In fact, many ITSM tools are evolving into ESM platforms. Overall, the benefits of ESM include better service delivery for employees, more efficient internal processes, and tighter integration and management of different services.
Since access to ITSM and ESM tools is becoming more and more mobile, a company could consider using the tools as a service in SaaS mode (Software as a Service), or even outsource the entire activity to a third party.
At Aprico Consultants, we combine cutting-edge expertise and the perfect mastery of both the technological and business aspects of our clients, regardless of their sector of activity, to develop and implement innovative ICT offers which improve our customers’ productivity, efficiency, effectiveness, and profitability. This way, we enable them to face the current technological challenges.
Thanks to video games, the general public got to know virtual reality first. Today, virtual reality also starts making waves in the industry, with a number of really practical and value-added applications for companies.
A promising market
First of all, it is really important not to confuse virtual reality (VR) with augmented reality (AR). AR uses a virtual interface, in 2D or 3D, which enriches reality by adding extra information. In practice, AR has the real world as its starting point, before embedding virtual objects, animation, data, sounds, and more, which the user then visualizes on a terminal (smartphone, tablet, glasses, or helmet). Virtual reality has a different approach. It offers a solution for digitally simulating an environment, with the possibility of virtually experiencing that environment through several senses (primarily vision, but also smell, hearing or touch).
According to a study by consulting firm Hampleton Partners, the global market for VR and AR will grow from $6 billion in 2016 to some $178.8 billion by 2022. While the AR market will experience a compound annual growth of 44.5% to reach $17.8 billion, the VR market will explode to a staggering $161 billion by 2022, a growth rate of no less than 85.4%. Hampleton Partners added that the growth will not only focus on the leisure sector, but also on health care, manufacturing and distribution, mainly for training purposes.
From the general public to the professional market
VR is nothing new. The technology’s roots date back to 1956, to the start of Morton Heilig’s Sensorama system. However, it was only in the 1990s that the first VR helmets for the general public appeared on the market. Meanwhile, products such as Oculus Rift and Go, HTC Vive, Samsung Gear, PlayStation VR and Google Cardboard have indeed won over the consumer through video games. But other industries are starting to take an interest in VR as well. As a result, museums, for example, now offer visitors a new type of experience, with so-called 360° immersive solutions.
Companies are also beginning to take more interest in the VR phenomenon. For example, manufacturers use VR to promote their brands or marketing events, or to support product presentations and product launches. Using VR, it is possible to present a product that is not yet physically available or to stage it in a more spectacular way. Similarly, the real estate industry uses VR to provide virtual tours of building projects, based on 3D modeling, virtual and 360° images, and new and more user-friendly interfaces. In industrial environments, VR allows partners and customers to take a virtual tour of production facilities, when safety and hygiene conditions do not allow physical visits. Some businesses offer these virtual tours because they prefer to be open and transparent about their manufacturing facilities.
Without a doubt one of the most promising areas for VR is providing assistance to training or maintenance. Indeed, the technology makes it possible to integrate industrial processes and machines into a tool that can help the technician to work in a virtual way, before he or she starts a physical intervention on site. What’s more, VR makes it possible to simulate different scenarios (such as incidents, anomalies, and risks), allowing the best possible preparation for the actual field work. And as is often the case with technological innovation, the sky is the limit. Imagination is in charge, ready to design new applications thanks to ever cheaper and more efficient technology.
Integration
A study by the Capgemini Research Institute revealed that 82% of companies that have implemented a VR or AR project reported results that were satisfactory or even exceeded their expectations. Some 75% of companies with VR or AR experience reported operating profits that were 10% higher than expected. It should also be noted that 50% of companies that have not yet deployed these technologies are planning to do so soon, while 46% of respondents indicated that they want to use VR or AR within three years.
To have a successful VR project, the challenge is to combine and integrate different technologies to offer the user a new and especially ‘natural’ experience. At the same time, innovation is required as VR products are at different stages of maturity. And in this area too, collaboration beyond the silos of the organization is essential to remove the boundaries between VR and the physical environment.
Convergence
In the long run, VR and AR will come closer, resulting in ‘mixed reality’ or ‘augmented virtual reality’ by combining both virtual and real-world elements. This way, engineers will work in real time on the same 3D object in a room, collaborating with other participants remotely, without using a specific device. Indeed, the elements of interaction will improve (including haptic feedback, such as the ability of a bare hand to feel a change of temperature or texture), all associated with voice control, air clicks or eye-controlled command. Hampleton Partners concludes: “The growth of the VR and AR industries will be boosted by the integration of technology into industrial and manufacturing processes. […] The reality is that many businesses now need a full AR/VR strategy to ensure they are not left behind.”
Partnership
Aprico develops and implements innovative ICT solutions that improve productivity, efficiency, and profitability, applying cutting-edge expertise and perfectly mastering both the technological and business aspects of clients in all industries. Aprico enables clients to be more efficient and to take up technological challenges, including virtual and augmented reality. To enhance the value of digital transformation projects, Aprico collaborates closely with its clients in an open and transparent way, with full access to all project information.
To guarantee operational excellence to its clients, Aprico Consultants developed a selection process for new hires based on a simple principle: we are looking for employees who are not only the most competent, but also able to deliver.
In the IT industry, the war for talent has been raging for many years. But when there is a severe shortage of competent staff in the market, it is important – more than ever – to maintain the highest level of expertise among our own team. By combining cutting-edge expertise and perfect management of both the technological and business aspects of our clients’ needs, Aprico develops and implements an innovative ICT offering, which improves the productivity, efficiency, and profitability of our customers, enabling them to perform better and to cope with today’s technological challenges.
To provide our clients with the best IT profiles, ready to help them carry out their IT strategy, we have based our recruitment process on the principles that can be found in ‘Smart and Get Things Done: Joel Spolsky’s Concise Guide to Finding the Best Technical Talent’. At Aprico, we have ambitious goals. We not only want to find new employees with the best competences, but also people who are ready to fully commit themselves to our company.
In practice, a candidate starts the selection process by solving a ‘business puzzle’. It’s an exercise that doesn’t require deep knowledge of either the .NET framework or a particular business domain, but that simply aims at assessing the candidate’s capacity for reasoning and for algorithm and code design. The candidate has to complete the exercise in less than 30 minutes; unit tests are provided. In the next step, the candidate is invited to discuss the techniques used with one or two Aprico employees, generally senior profiles. The objective is to learn – apart from his or her technical skills – how the candidate thinks, analyzes and understands a problem. But obviously, the solution is more important than the problem. That’s why we want to learn more about the candidate’s way of structuring solutions and expressing himself or herself. If necessary, the candidate receives some clues to help him or her on the way.
The technical exercise is only the starting point. In the next step, we start a conversation with the candidate to try and dig as deep as possible into his or her skills and knowledge. The conversation is a living thing, but still we follow a certain canvas, to make sure we cover all the most important areas, including security, design, software engineering practices, testing, and various technology stacks. The conversation generally lasts one hour. The information is consolidated into an appreciation sheet that is shared with the HR department and that supports our ‘decision’. Of course, we also discuss our point of view with our HR colleagues.
For Aprico, the selection process also shows how we master the technological field in question. It reveals that we are only looking for the very best profiles for architecture, design and patterns, security and software engineering, including agile methods, integration and various technology stacks (WebApi, ASP.NET MVC, Angular, Data Persistence, WCF, and more). The decision to retain a candidate is made unanimously by both interviewers.
“Overall, we are less interested in the candidate’s résumé than in his ability to deliver results and in the way this was achieved”, says François Chabot, Solution Architect and Microsoft Specialist. “In fact, the test we developed is really very simple. But we expect a solution that is extremely precise, similar to the work of a goldsmith.” Aprico is mainly looking for candidates with a certain level of experience, “but not necessarily for the proverbial five-legged sheep.”
Aprico’s mission is to develop and implement end-to-end ICT solutions that translate a company’s strategy, objectives, and constraints into pragmatic transformation programs, delivering real added value and a proven return on investment in a market that is in constant evolution and transformation. Our requirements in terms of values and culture, coupled with a close collaboration with the client’s business and technical teams, enable us to offer tailored services and achieve the operational excellence required by our projects.
When you – like 60% of all companies – are planning an IoT project, be aware that integration is the main challenge. And as the IT and business capabilities you need are rare, you really can’t go without a technology partner.
Tomorrow’s world is a connected one. Almost all objects will be intelligent and will communicate through the Internet of Things (IoT). Of course, we need high-performance and reliable infrastructure and platforms to collect data and to run the solutions that process it. In all this, we cannot deny the importance of interoperability between these elements. In the IoT world, integration is a key success factor.
The possibilities are gigantic
According to market research by Bain & Company (‘Unlocking Opportunities in the Internet of Things’), the overall IoT market – including hardware, software, integration and telecom – will be worth 520 billion dollars by 2021. That is twice the size of the market in 2017. Researchers at McKinsey even value the IoT market at 11,000 billion dollars by 2025.
The numbers prove that there is a lot of demand in the market. 60% of companies have plans to start IoT projects. Two years ago, that number was only 40%. But there is a downside to the increase in demand as well. The new plans are mainly for smaller projects.
According to Gartner, B2B drives the IoT market’s current value, but the big volumes can be found in B2C. In 2017, consumer objects – including cars, TV sets and home automation – represented some 63% of the market. On the other hand, companies are responsible for the main share of IoT investments: 964 billion dollars, compared to 725 billion in the consumer market. In the professional market, all industries will be impacted by IoT, even when today’s applications – often described as ‘smart’ – primarily focus on cities, health care, production and logistics, home automation, transportation and buildings.
Attraction
Several factors explain the current craze for IoT. The production cost of the new generation of sensors has dropped considerably, as a result of the increase in mass production. At the same time, the devices are gradually becoming more energy efficient. In addition, the underlying infrastructure is in full development, be it the cloud, essential to collect and process these gigantic data volumes, or mobile networks such as 2G, 3G and 4G. Of course, we also need to mention the rise of WiFi and Bluetooth networks, as well as more dedicated networks like LoRa or Sigfox, and fixed infrastructure that is compatible with IoT. An enabling element is also the fact that the cloud has become more elastic and is perfectly scalable to fit a company’s needs, both in functionality and in cost.
Lower hardware costs, both in terms of storage and information processing, offer another accelerator of change. The internet is becoming more and more important in the industrial world, as is demonstrated by the growing number of Industry 4.0 projects.
As a result, an increasing number of vendors are providing specific platforms for the development of custom IoT solutions, for the use of existing applications, and for the integration of IoT with internal company processes.
Challenges
Basically, an IoT project combines three elements: the data, captured by sensors; the process (automated or not) to collect the data, analyze it and initiate an action; and the user who interacts with the sensor and the process to track information in real time and control it remotely.
As a result, there are three major lines of development. First of all, the deployment and (remote) management of sensors. Secondly, connectivity management to allow the visualization of the sensors and their status using a synoptic dashboard. Finally, the development of applications that generate actions and integrate the collected data in a process to create new business models or improve existing services. The value of an IoT project is created either by transforming and optimizing an existing business process, or by promoting the creation of innovative businesses, products or services, McKinsey confirmed.
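The three elements named above – the data, the process and the user – can be sketched in a few lines. Everything here is a simplified, hypothetical illustration: the readings, the threshold and the dashboard format are invented for the example, and a real deployment would involve actual sensors, a message broker and a visualization layer.

```python
# Minimal sketch of the three IoT building blocks described above:
# the data (sensor readings), the process (collect, analyse, act)
# and the user view (a simple status line for a dashboard).

from statistics import mean

def collect(readings):
    """The 'data' element: raw sensor values, e.g. temperatures."""
    return [r for r in readings if r is not None]  # drop failed reads

def analyse_and_act(values, threshold=75.0):
    """The 'process' element: analyse the data and decide on an action."""
    avg = mean(values)
    action = "raise_alert" if avg > threshold else "none"
    return {"average": avg, "action": action}

def dashboard(status):
    """The 'user' element: real-time status for remote monitoring."""
    return f"avg={status['average']:.1f} action={status['action']}"

values = collect([70.2, 80.5, None, 77.1])   # one sensor read failed
status = analyse_and_act(values)
print(dashboard(status))  # avg=75.9 action=raise_alert
```

The business value discussed next comes from what `analyse_and_act` triggers: feeding the decision back into an existing process, or into an entirely new service built on the data.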
But the major challenge of IoT is probably interoperability, said McKinsey, which is needed to fully unlock the potential value of a solution. It is clear that the current generation of solutions is rather vertical, built for specific markets such as automotive, utilities (smart meters, for example) or health care. So far, there is too little sharing of data, either internally within the organization, or externally with service providers who could exploit the data further and monetize it.
Integration
Interoperability implies integration, in the broad sense of the term. Obviously, we are talking about network connectivity here, and about the ability to transmit data (often very small packets, but in extremely large volumes) in real time. As a result, a solution can’t exist without maximum availability of both sensors and transmission, with latency as low as possible.
In addition, it is also a question of integrating the various hardware elements (sensors, gateways, and more) not only into the backbone that is responsible for collecting and processing the data, but also into the overall (legacy) infrastructure. Similarly, it will be necessary to ensure compatibility between various components and firmware. At this infrastructure level, standardization is still in its infancy, particularly when it concerns M2M protocols.
Security is one of the obvious challenges as well, knowing that the Internet of Things is particularly vulnerable to hacking and other cyber-attacks. And here too, integration with security tools will be paramount.
Partnership
With the shortage of IT skills in mind, it is clear that the success of an IoT project depends on an ecosystem of partners, including a service provider that is able to understand the various IoT technologies, but also to align the project with a business strategy that can bring added value.
Specializing in digital transformation solutions, Aprico Consultants solves the problems that customers encounter today by applying the technologies of tomorrow. The Internet of Things, especially, offers an approach that is likely to answer many business problems or open perspectives that have remained unexplored until now.
AI: it’s now or never!
Data is the new oil. Clever use of data adds an obvious competitive advantage to your company’s digital transformation. Large companies have already understood this for quite some time. But also SMEs can integrate artificial intelligence into their business processes to further exploit the data they own – as long as they adopt a clear strategy first.
Tidal wave
After the cloud and big data, artificial intelligence appears to be the next concept in vogue. Some figures clearly illustrate the magnitude of the phenomenon. According to a study by Vanson Bourne in July 2017, 80% of companies are already using some form of artificial intelligence (machine learning or deep learning, see below), while 30% of companies plan to increase their investment in AI in the next 3 years. In addition, 62% of companies plan to appoint an AI Director in the future.
Overall, according to the survey, companies are thinking about the deployment of AI in data transmission, intelligent workflows and decision support processes, as well as in large-scale analytics. The only barriers to a wider implementation of AI in the business are a lack of IT infrastructure and a shortage of skilled IT staff.
Definitions
First of all, we must agree on the definitions we use. According to Cigref (the French network of large companies), AI is “the capacity of a functional unit to perform functions generally associated with human intelligence, such as reasoning and learning.” In practice, AI covers two main areas: machine learning and deep learning. According to Cigref, machine learning “combines algorithms that learn from examples and from data.” In other words, machine learning makes predictions based on data sets. Deep learning is all about a technology’s ability to learn from raw data.
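The idea that machine learning “learns from examples and from data” can be made concrete with a toy sketch: fit a straight line y = a·x + b to a handful of data points by least squares, then use the fitted model to predict a new value. The data and the wear-prediction scenario are invented for the example; a real project would use a library such as scikit-learn, but the principle is the same.

```python
# Toy illustration of learning from examples: least-squares fit of
# y = a*x + b, then a prediction on unseen input.

def fit_line(xs, ys):
    """Estimate slope a and intercept b from example pairs (x, y)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# 'Training' examples: hours of machine use vs. observed wear (invented)
xs = [1, 2, 3, 4]
ys = [2.1, 4.0, 6.2, 7.9]

a, b = fit_line(xs, ys)
predict = lambda x: a * x + b
print(f"{predict(5):.2f}")  # predicted wear after 5 hours
```

Deep learning follows the same learn-then-predict loop, but with many more parameters learned from raw data rather than two coefficients from hand-picked features.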
AI also includes other technologies, such as symbolic AI, logic programming, and rules engines. Since all of this has to do with learning processes, it is necessary to feed the systems first, before training them in the context of multiple iterations, which requires the use of data scientists.
Applications
As a first step, AI targets business areas such as marketing, maintenance, logistics, control, human resources, and customer relations. In other words: all sectors where data volumes are (very) important and where models, trends, and repetitive operations can be identified through analytics and information processing. The evolution consists of moving from (human or automated) programming to learning, with systems that gradually become self-learning. Please note that a human no longer needs to understand a phenomenon or process to teach it to a machine, as it will seek and find the right solutions itself.
At the moment, it’s still a dream to think that machines will become really intelligent – according to some sources even more intelligent than man. However, computing power – especially neural computing – can significantly improve the speed and volume of data processing, provided you have quality data, accurate algorithms, and specialized human knowledge. Because in the end, the quality and nature of the data largely determine the technological choices.
Overall, the company will have to choose between either contacting an external partner who will develop a custom algorithm, or investing in existing AI solutions that the company will buy or rent as SaaS (Software as a Service). Similarly, APIs (Application Programming Interfaces) can be developed to access and use data in AI applications automatically and transparently.
AI for all
That said, you don’t need to be called Amazon, Uber, Netflix or Airbnb to embark on an AI track. Data processing and data analysis are not the exclusive domains of data scientists from large organizations. Data storage infrastructure and analysis tools are available for everybody in the cloud. At the same time, harnessing the potential of AI requires not just a strategy, but also budget and time. Thanks to AI, small and medium-sized companies will be able to perform complex tasks more efficiently and at a lower cost. It will allow them to develop new business models, but also to consider new activities – completely new or derived from existing products.
At the end of the day, AI leaves no room for improvisation. AI requires not only technological knowledge, but first and foremost a perfect understanding of the customer’s business. Who else but your trusted IT partner understands your strategy, needs, and goals? Aprico Consultants assists, guides, facilitates, and coordinates your projects to enable faster implementation, generate efficiency and reduce costs.
More info about AI: sales@aprico-consult.com