Migrate your work environment to Office 365 with confidence

Recent events have shown that teleworking is no longer a luxury for employees, but a genuine necessity for ensuring the continuity of organisations’ activities.

For those who have not yet taken the plunge (mainly mid-sized companies and the public sector), it is essential to start thinking about Cloud collaboration and communication platforms as soon as possible, in order to ensure continuity of service in the event of force majeure (cyber attack, natural disaster or even pandemic), or even to prepare for a more substantial migration.

For this Digital Workplace platform, close collaboration between the security and workplace teams will be a prerequisite!

In this article, I will share some lessons learned from deployments of Office 365, Microsoft’s solution that is increasingly popular with the companies we support.

There is a lot of interesting documentation on the subject on the Internet (“Top 10 best practices” or “3 good reasons to connect the xxx application to ensure your security”), and Microsoft itself summarizes some of these good practices in its own documentation.

I am not going to repeat such a non-exhaustive list of good practices here, but rather highlight six points of attention when opening this kind of service.

1st point: Building the security standard, a pillar of the future relationship between the security and workplace teams

As with any project of this type, the first step is to assess the potential of the service and see how it can meet the initial need, through the development of a business case. The possibilities offered by Office 365 are numerous: office productivity, instant messaging and email, data visualization, no-code application development, etc.

As far as cybersecurity teams are concerned, there are two options: oppose this migration because of the risks linked to the American Cloud, or support the initiative and help shape new, secure uses.

In the vast majority of cases, the second option is preferred. A three-way relationship then begins between the workplace, security and architecture teams, with the aim of building a service for users. One outcome of this step may be a security standard, derived from a risk analysis, that defines the services to be used and their associated configuration.

Among the issues to be addressed, three themes generally come up:

  • What uses should be offered to employees on the move, and with what authentication?
  • Which new services can be offered thanks to API integration?
  • How should documents be shared with external users?

The current trend is to answer these questions with a “Zero Trust” approach. Any deviation from the defined security standard will have to be detected, thanks to dashboards and supervision. The adage “trust does not exclude control” has never made more sense.
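In practice, this supervision can start very small. As an illustration, here is a minimal sketch (assuming the SharePoint Online management module; the admin URL and the expected value are hypothetical placeholders taken from such a standard) that checks one configuration point, the tenant-wide external sharing setting, and flags any drift:

  # Minimal drift-detection sketch (assumes the Microsoft.Online.SharePoint.PowerShell module;
  # the admin URL and expected value are illustrative)
  Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

  $expectedSharing = "ExistingExternalUserSharingOnly"    # value defined in the security standard (assumption)
  $actualSharing   = (Get-SPOTenant).SharingCapability

  if ("$actualSharing" -ne $expectedSharing) {
      Write-Warning "Deviation detected: SharingCapability is '$actualSharing', expected '$expectedSharing'."
      # Feed this kind of check into the supervision dashboards mentioned above
  }

The same pattern can be repeated, and scheduled to run regularly, for any setting defined in the standard.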

This reflection may even be an opportunity to ask fundamental questions and lay a coherent foundation for the working environment. For example, why leave email, a 30-year-old system, wide open to the outside world while blocking external Teams and SharePoint shares? Improving the user experience can only be achieved by standardizing security practices.

 

2nd point: Data protection, a subject gaining momentum

In parallel with the construction of the service comes the question of the data that will live in the tenant. Two simple questions must be answered, and the answers are often complex.

How do I protect my data?

Today, strategies for protecting unstructured data share a common foundation: mapping each piece of data to a level of sensitivity. This mapping then drives the protection measures to be put in place:

  • Encryption with keys controlled by the CSP or the organisation;
  • Restriction of rights (or DRM);
  • Conditional access with multi-factor authentication;
  • Data Leakage Protection (or DLP).

In order not to over-protect data, and thus avoid undermining the user experience, encryption and rights restriction can be reserved for the most critical data. Other data will remain under control through more traditional measures, such as end-to-end encryption and exposure control.
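To give a concrete flavour of the DLP measure listed above, here is a minimal sketch (assuming Security & Compliance PowerShell via the ExchangeOnlineManagement module; the policy name, locations and sensitive information type are illustrative) that blocks access to documents containing credit card numbers, starting in test mode:

  # Minimal DLP sketch (assumes Connect-IPPSSession from the ExchangeOnlineManagement module;
  # names, locations and the sensitive information type are illustrative)
  Connect-IPPSSession

  New-DlpCompliancePolicy -Name "Standard-DLP-Critical" `
      -ExchangeLocation All -SharePointLocation All -OneDriveLocation All `
      -Mode TestWithoutNotifications    # start in test mode before enforcing

  New-DlpComplianceRule -Name "Block-CreditCard-Sharing" -Policy "Standard-DLP-Critical" `
      -ContentContainsSensitiveInformation @(@{ Name = "Credit Card Number" }) `
      -BlockAccess $true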

A key success factor for such a project will be to turn it into a genuine business project, with a comprehensive awareness programme dedicated to classification.

How to remain compliant with the regulations?

Depending on its activities, an organisation may be subject to local regulations, regulations related to the countries where it operates, and sector-specific regulations.

In some cases, these regulations and directives impose real obstacles that need to be removed at the outset of the project: data retention, legal archiving, data geolocation, judicial investigation, and requests related to personal data.
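Retention, for example, can be translated into technical controls from the start of the project. A minimal sketch (Security & Compliance PowerShell; the names and the seven-year duration are assumptions to be replaced by your own legal requirements):

  # Minimal retention sketch (assumes Connect-IPPSSession; names and duration are illustrative)
  Connect-IPPSSession

  New-RetentionCompliancePolicy -Name "Legal-Retention-7y" `
      -ExchangeLocation All -SharePointLocation All

  New-RetentionComplianceRule -Name "Keep-7-years" -Policy "Legal-Retention-7y" `
      -RetentionDuration 2555 -RetentionComplianceAction Keep    # 2555 days, roughly 7 years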

Let’s take a concrete example: Russia. With its 2015 law on personal data, the national regulatory authority requires that the source (known as the primary database) of Russian citizens’ data be kept on Russian soil. In practice, this means that the Active Directory (the primary base of corporate identities) of the Russian entity must remain in Russia. From there, the information can be synchronized with the GAL (Global Address List) and Azure Active Directory.

The thorny issue of existing data

What should be done with existing data? This is a complex issue, especially if the opening of a Cloud collaboration solution is tied to the decommissioning of existing file servers.

First of all, there is a technical question: will the company’s network be able to support massive migrations of .pst files and documents? Moreover, it will not necessarily be useful to migrate data that does not comply with the retention policy.
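Before moving anything, a simple inventory helps quantify what actually needs to be migrated. The sketch below (plain PowerShell; the share path and the five-year threshold are assumptions) lists the files falling outside a hypothetical retention window, which could be archived or excluded rather than migrated:

  # Minimal pre-migration inventory sketch (the share path and threshold are illustrative)
  $share     = "\\fileserver01\departments"    # hypothetical file server
  $threshold = (Get-Date).AddYears(-5)         # retention window assumed by the standard

  Get-ChildItem -Path $share -Recurse -File -ErrorAction SilentlyContinue |
      Where-Object { $_.LastWriteTime -lt $threshold } |
      Select-Object FullName, LastWriteTime,
                    @{ Name = "SizeMB"; Expression = { [math]::Round($_.Length / 1MB, 2) } } |
      Export-Csv -Path ".\stale-files.csv" -NoTypeInformation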

Secondly, historical data may have heterogeneous levels of sensitivity and be subject to various regulations. A trade-off will be necessary between local data retention, risk acceptance and a broad classification project carried out before or after migration.

3rd point: The Target Operating Model, guaranteeing the preservation of security over time

The operational model of a service such as Office 365 defines the responsibilities of the players (administrators, support staff, etc.) and the principles of object management. It is complementary to the security standard mentioned above, providing an operational vision.

The TOM must be drawn up prior to the opening of the service and updated regularly. It must include at least the following subjects.

An administration model

Microsoft offers around 50 administration roles by default, not counting the service-specific RBAC roles (e.g. Exchange and Intune). Relevant use of these roles, together with custom roles, helps avoid having too many Global Administrators and supports the principle of least privilege. Implementing Just-in-Time access also makes it possible to monitor the actual use of roles while reinforcing security.
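A simple recurring check helps keep the number of Global Administrators under control. A minimal sketch with the Microsoft Graph PowerShell SDK (the alert threshold of five is an illustrative target, not a Microsoft rule):

  # Minimal least-privilege check sketch (assumes the Microsoft.Graph PowerShell SDK)
  Connect-MgGraph -Scopes "RoleManagement.Read.Directory"

  $role    = Get-MgDirectoryRole -Filter "displayName eq 'Global Administrator'"
  $members = Get-MgDirectoryRoleMember -DirectoryRoleId $role.Id

  if ($members.Count -gt 5) {    # threshold chosen for the example
      Write-Warning "$($members.Count) Global Administrators found: review against the principle of least privilege."
  }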

A joint architecture and security community

Like any SaaS platform, Microsoft regularly upgrades the functionalities of its collaborative suite. The mission of this community will be to keep watch over these changes, in order to master new uses and keep control of the tenant as it evolves.

The life cycle of identities and shared spaces

If shared spaces (Teams, SharePoint) are left unmanaged, this can lead to an explosion in the number of spaces that do not comply with the security standard; the reports published by Data Discovery solution vendors are quite striking on this point. To avoid this, a life cycle must be established for shared spaces. These rules can include a naming convention, retention policies, a lifespan and principles for rights management.

Setting up a single portal for the creation of these spaces will make it possible to enforce these good practices while promoting the user experience.
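As an illustration of the naming convention mentioned above, a recurring audit can flag non-compliant Microsoft 365 groups (and therefore the Teams and SharePoint spaces behind them). A minimal sketch with the Microsoft Graph PowerShell SDK; the expected prefix is purely hypothetical:

  # Minimal naming-convention audit sketch (assumes the Microsoft.Graph PowerShell SDK;
  # the 2-5 letter department prefix is a hypothetical convention)
  Connect-MgGraph -Scopes "Group.Read.All"

  Get-MgGroup -Filter "groupTypes/any(c:c eq 'Unified')" -All |
      Where-Object { $_.DisplayName -notmatch '^[A-Z]{2,5}-' } |
      Select-Object DisplayName, Mail, CreatedDateTime |
      Export-Csv -Path ".\non-compliant-spaces.csv" -NoTypeInformation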

Similarly, a life cycle for Azure AD objects (including guest users, security groups, Office 365 groups and applications) must be defined and supported by tooling. Two examples deserve attention: API permission delegation left open, which leaves the door open to massive data leaks; and guest users invited to collaborate who are never deleted. To address this, two strategies are possible:

  • #1 – Creation of a custom automation engine decoupled from the IAM, via an in-house application developed in PowerShell (a minimal sketch is given after this list);
  • #2 – Integration of a PowerShell / Graph API connector into the IAM solution already in place, in order to provide complete management of the objects regardless of where they are hosted.
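Strategy #1 can start as small as the following sketch, which lists guest accounts invited more than a year ago so that they can be reviewed and removed if need be (Microsoft Graph PowerShell SDK; the one-year threshold is an illustrative policy choice):

  # Minimal guest life-cycle sketch (assumes the Microsoft.Graph PowerShell SDK;
  # the 365-day threshold is an illustrative policy choice)
  Connect-MgGraph -Scopes "User.Read.All"

  $cutoff = (Get-Date).AddDays(-365)

  Get-MgUser -Filter "userType eq 'Guest'" -All -Property Id, DisplayName, Mail, CreatedDateTime |
      Where-Object { $_.CreatedDateTime -lt $cutoff } |
      Select-Object DisplayName, Mail, CreatedDateTime |
      Export-Csv -Path ".\guests-to-review.csv" -NoTypeInformation

  # After review, stale guests could be removed with Remove-MgUser -UserId <id>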

4th point: Take a fresh look at the subject of user identity

Indeed, the subject of identity is a pillar of SaaS! So take the time to consider all the possibilities and risks of SaaS Identity Providers (or IdPs). In particular, it is unthinkable in 2020 to treat Azure Active Directory as a simple domain controller in the Cloud.

Three approaches are possible for the source of identities accessing Office 365.

The dissociation of identities, a quick win but complicated from the user’s point of view

It is possible to dissociate the local and Cloud identities if the local AD is no longer available, or to decouple the Cloud workspace from the historical IS. This scenario obviously does not offer an optimal experience, but it can be a valuable asset in the event of a crisis.

The use of local identity in the Cloud, a classic strategy

In order to reconcile security and user experience, the same identity should be used across legacy applications and this new service. Three technical scenarios are available (a quick way to check a tenant’s current configuration is sketched after the list):

  • Identity Federation: this historic solution is widely used by large French companies that are reluctant to host passwords in the Cloud and wish to keep SSO;
  • Password Hash Sync (PHS): this solution, recommended by Microsoft and by the NCSC (the British equivalent of ANSSI), is implemented by the vast majority of Microsoft customers. It can also serve as a fallback when the federation service is no longer available;
  • Pass-through Authentication (PTA): this solution provides the best user experience but has the disadvantage of passing the password through Azure AD at each sign-in.
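A minimal sketch to check where a tenant currently stands (Microsoft Graph PowerShell SDK; domain names are of course tenant-specific):

  # Minimal check of the current identity model (assumes the Microsoft.Graph PowerShell SDK)
  Connect-MgGraph -Scopes "Domain.Read.All"

  # "Federated" domains rely on an on-premises federation service (e.g. AD FS);
  # "Managed" domains authenticate against Azure AD (PHS or PTA)
  Get-MgDomain | Select-Object Id, AuthenticationType, IsVerified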

Migrating one’s identity repository to the Cloud, a longer-term vision

Before or after migration, it may be appropriate to consider fully migrating the source of identities to the Cloud (whether Azure AD or a third-party solution) in order to take advantage of new possibilities. Several prerequisites still need to be addressed first, such as printer, GPO and device management.

5th point: Gradually open up services to encourage controlled adoption

It is always easier to open a new service than to close it again later for security reasons. Massively opening the different services of the collaborative suite has the advantage of offering a maximum number of use cases, but it can cause several side effects.

First of all, services that are not officially supported and are left in users’ hands for testing purposes represent a real risk. They need to be configured and hardened; in some cases, it may even be preferable to disable the corresponding licences.
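Rather than leaving an unsupported workload open, the corresponding service plans can be disabled within the licence itself. A minimal sketch with the Microsoft Graph PowerShell SDK; the SKU, the disabled plans and the target user are illustrative:

  # Minimal service-plan hardening sketch (assumes the Microsoft.Graph PowerShell SDK;
  # SKU, disabled plans and user are illustrative)
  Connect-MgGraph -Scopes "User.ReadWrite.All", "Organization.Read.All"

  $sku            = Get-MgSubscribedSku | Where-Object SkuPartNumber -eq "ENTERPRISEPACK"    # Office 365 E3
  $plansToDisable = @("SWAY", "YAMMER_ENTERPRISE")    # workloads assumed not yet supported internally
  $disabledPlans  = $sku.ServicePlans |
      Where-Object { $_.ServicePlanName -in $plansToDisable } |
      Select-Object -ExpandProperty ServicePlanId

  Set-MgUserLicense -UserId "user@contoso.com" `
      -AddLicenses @(@{ SkuId = $sku.SkuId; DisabledPlans = $disabledPlans }) `
      -RemoveLicenses @()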

Secondly, a controlled launch of the tools will help keep costs under control during the first months or years of the transition. As Microsoft licences represent a significant cost, unused licences can be optimized.
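A quick way to spot over-provisioning is to compare purchased and assigned units per SKU. A minimal sketch with the Microsoft Graph PowerShell SDK:

  # Minimal licence-consumption sketch (assumes the Microsoft.Graph PowerShell SDK)
  Connect-MgGraph -Scopes "Organization.Read.All"

  Get-MgSubscribedSku |
      Select-Object SkuPartNumber,
                    @{ Name = "Purchased";  Expression = { $_.PrepaidUnits.Enabled } },
                    ConsumedUnits,
                    @{ Name = "Unassigned"; Expression = { $_.PrepaidUnits.Enabled - $_.ConsumedUnits } }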

Change management is also a key aspect to consider: to promote the user experience, of course, but also to protect data. It is essential to have a clearly defined roadmap and user journey. Guided adoption will lay the foundations for proper governance of shared spaces and data (both in terms of exposure and protection).

It will be useful to consider creating a community of evangelists and users in order to maintain momentum in adopting the new functionalities brought by Microsoft. A UserVoice-type system can be an asset: the ideal is to listen to users’ needs and prioritise future openings accordingly.

6th and last point: Licences, the lifeblood of Office 365 and its security

SaaS solutions are generally billed on a monthly licensing model. The choice of Microsoft 365 licences must be the result of a global reflection: it cannot remain the prerogative of the workplace teams alone, nor be determined solely by collaboration and communication needs.

Indeed, the choice of licensing level will condition the security strategy of the tenant and, more broadly, the strategy for securing the entire work environment: Microsoft is increasingly positioning itself as a challenger to security solution providers, being the only vendor to offer such a complete suite.

The licensing of security options must be dealt with at the start of the project and at each renewal. It will be cheaper to include a licensing package from the outset than to order AAD P1 licences on an emergency basis to cover an unforeseen need for conditional access.

As part of this strategy, it may be appropriate to target specific populations and adapt the security requirements to their profile (VIPs, admins, medical staff, etc.).
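For instance, the administrator population could be covered by a stricter conditional access policy (which requires AAD P1). The sketch below uses the Microsoft Graph PowerShell SDK; the group ID is a hypothetical placeholder and the policy is created in report-only mode:

  # Minimal conditional access sketch (assumes the Microsoft.Graph PowerShell SDK and AAD P1 licences;
  # the group ID is a hypothetical placeholder)
  Connect-MgGraph -Scopes "Policy.ReadWrite.ConditionalAccess"

  $policy = @{
      displayName = "CA-Admins-Require-MFA"
      state       = "enabledForReportingButNotEnforced"    # report-only first, enforce once validated
      conditions  = @{
          clientAppTypes = @("all")
          applications   = @{ includeApplications = @("All") }
          users          = @{ includeGroups = @("00000000-0000-0000-0000-000000000000") }    # hypothetical admin group
      }
      grantControls = @{
          operator        = "OR"
          builtInControls = @("mfa")
      }
  }

  New-MgIdentityConditionalAccessPolicy -BodyParameter $policy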

This approach, presented here for Office 365, can be generalised to any SaaS (Software as a Service) offering, or even to IaaS (Infrastructure as a Service) or PaaS (Platform as a Service) services.

 
