Case Study 02 · Microsoft · Enterprise Internal Tool

Microsoft
GOLocal M365.

I led the end-to-end design of GOLocal Buildout — a unified deployment tracking tool that gave Microsoft's global M365 team real-time visibility across 40+ countries, replacing Excel sheets and 4 fragmented tools with a single source of truth.

Role: Product Designer
Platform: Web · Internal Tool
Design System: Microsoft Fluent UI
Team: 13-person cross-regional
40+
Countries Tracked in Real Time
from Excel spreadsheets and 4 disconnected tools to one unified deployment dashboard
0
Design Debt Requests Post-Launch
self-sufficient handoff documentation — engineering team never needed to chase the designer
0
Major Flow Changes in Development
storyboarding before Figma and testing with real data eliminated all development-phase surprises

I led the end-to-end design of GOLocal Buildout for Microsoft's M365 global deployment program. A 13-person team responsible for expanding cloud infrastructure into 40+ countries simultaneously was managing everything in Excel spreadsheets and 4 disconnected tools. I reframed the problem from a data visualization task to a situational awareness challenge — and designed a unified deployment dashboard that gave the team real-time visibility, reduced decision latency, and eliminated the need for daily manual data consolidation.

I owned this project end-to-end as the sole Product Designer — from research and problem framing through to final handoff and post-launch support.

I led AEIOU observational research and stakeholder interviews
I defined the information architecture and dashboard structure
I facilitated cross-timezone ideation sessions
I storyboarded every complex interaction before Figma
I iterated through 4 versions with real deployment data
I delivered annotated handoff documentation with zero design debt

Sole designer on a 13-person cross-regional engineering team.

One team. 40 countries. Zero unified visibility.

A 13-person team responsible for expanding Microsoft's M365 cloud infrastructure into 40+ countries simultaneously — Germany, Israel, Poland, Mexico, and dozens more. Each market with its own data center requirements, local compliance rules, and workload dependencies. They were managing all of it in Excel spreadsheets and 4 disconnected tools.

The moment I knew: When I mapped the full deployment journey — from business evaluation through data center planning to market launch — I identified 7 handoff points where information was supposed to move between people. At 5 of them, it got lost, delayed, or duplicated. I reframed this immediately: that was not a process problem. That was a design problem.

7
"There were 7 handoff points in the deployment workflow where information was supposed to move between people. At 5 of them, it got lost, delayed, or duplicated. That was not a process problem. That was a design problem."
— Mahesh · Journey mapping session, GOLocal Buildout
Excel as a Dashboard
Deployment tracking lived in manually maintained Excel sheets. Every update was manual — error-prone, time-consuming, and always stale by the time it reached decision-makers.
→ Hours lost weekly to consolidation that should be automatic
Fragmented Tool Stack
Data lived across SharePoint, Visual Studio, SQL, and Power BI with no unified view. A complete picture of one country required checking four tools and reconciling what you found.
→ No single source of truth across a 13-person cross-timezone team
No Real-Time Status
Bottlenecks were discovered after delays had already cascaded through dependent workloads. A blocked deployment in Poland surfaced days after it could have been caught.
→ Reactive firefighting instead of proactive management
Compliance Risk
Restricted markets — public sector, healthcare, finance — had unique compliance requirements per country. Without centralized tracking, compliance knowledge walked out the door when people changed roles.
→ Compliance gaps invisible until audit — too late to fix cheaply
⬡ client confidential
Current vs Proposed Approach
Current vs Proposed — Manual Excel vs Unified Dashboard
⬡ client confidential
ADR tracked in Excel
Mexico GoLocal ADR — tracked manually in Excel
⬡ client confidential
DSP tracked in Excel
Mexico GoLocal DSP — deployment status in spreadsheet, updated manually

One tool. Three user types. Very different needs.

The GOLocal team wasn't a monolith — I identified three distinct user groups with different information needs, different workflows, and different tolerances for complexity. Designing for one without understanding the others would have produced a tool that worked for nobody.

Deployment Engineers
Need granular workload-level status — EXO, SPO, Teams — for each country. Their job is execution and unblocking. They need to see exactly what's stuck and why, without digging through reports.
→ Need: drill-down workload status per market
Local Market Leads
Need a high-level view of their specific country — timeline, compliance status, blockers. They're accountable to regional leadership and need to self-serve answers without pinging the central team.
→ Need: country-level summary with compliance visibility
Support & Operations Staff
Need to track compliance requirements for restricted markets — public sector, healthcare, finance. Their biggest risk is compliance knowledge walking out the door when people change roles.
→ Need: centralized compliance tracking per market type
The Key Insight
All three user groups needed different views of the same underlying data. A single dashboard layout would fail all three. The architecture decision — not the visual design — was what made this work.
→ Solution: role-contextual views from shared data model
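To make the "shared data model, role-contextual views" idea concrete, here is a minimal sketch of how one deployment record could feed three different projections. All type names, fields, and status values are illustrative assumptions, not the actual GOLocal data model.

```typescript
// Hypothetical sketch: one shared deployment record, three role-specific projections.

type WorkloadStatus = "on-track" | "at-risk" | "blocked";

interface Deployment {
  country: string;
  workloads: Record<string, WorkloadStatus>; // e.g. EXO, SPO, Teams
  complianceItems: { requirement: string; met: boolean }[];
  launchTarget: string; // ISO date
}

// Deployment engineers: granular, cross-market view of what is stuck and why.
function engineerView(d: Deployment) {
  return Object.entries(d.workloads)
    .filter(([, status]) => status !== "on-track")
    .map(([workload, status]) => ({ country: d.country, workload, status }));
}

// Local market leads: a self-serve one-line summary of their own country.
function marketLeadView(d: Deployment) {
  const blocked = Object.values(d.workloads).filter((s) => s === "blocked").length;
  const complianceOpen = d.complianceItems.filter((c) => !c.met).length;
  return { country: d.country, blocked, complianceOpen, launchTarget: d.launchTarget };
}

// Support & operations: centralized open compliance items across all markets.
function complianceView(deployments: Deployment[]) {
  return deployments.flatMap((d) =>
    d.complianceItems
      .filter((c) => !c.met)
      .map((c) => ({ country: d.country, requirement: c.requirement }))
  );
}
```

The point of the sketch: each view is a cheap projection over the same record, so the expensive decision is the shared model, not the three screens.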

AEIOU framework. Observe before designing.

01
AEIOU Observational Research — Context You Cannot Get From Interviews
Applied the AEIOU framework (Activities, Environment, Interactions, Objects, Users) because it captures the full context of how a team operates — including the workarounds, informal communication, and moments where the official process diverges from reality. I needed to observe people working, not hear them describe idealized versions of their workflow. The 7 handoff failure points were found through observation, not interviews.
02
Stakeholder Interviews + Journey Mapping
Interviewed each team member individually — not in groups, because group interviews produce consensus answers. Built user journey maps for each of the 3 user groups (deployment engineers, local market leads, support staff) tracing how a deployment request moved from business evaluation through to market launch. All 3 user groups needed different views of the same data — designing for one would fail the other two.
03
Cross-Timezone Ideation — Preventing Groupthink Across USA and India
Facilitated ideation sessions with engineers in both the USA and India. Silent brainstorming first — everyone writes independently before anyone speaks. Prevents the loudest voice anchoring the group, especially important in cross-cultural sessions where deference to seniority can suppress good ideas. From 40+ ideas, dot voting narrowed to 8 viable concepts. Insight: the team needed cockpit-style situational awareness — not more detailed reports.
04
Storyboarding Before Figma — Zero Major Flow Changes in Development
Storyboarded every complex interaction flow before opening Figma. On a data-dense enterprise tool this is not optional — it forces edge case thinking before prototype investment. What does "blocked" mean for a market waiting on compliance vs. one waiting on infrastructure? Each storyboard reviewed with a team member before moving forward. Result: zero major flow changes during the development phase.
⬡ client confidential
ADR Migration Status Report
ADR Migration Status Report — Power BI · Before GOLocal
⬡ client confidential
FFP Capacity Demand Israel
GoLocal FFP Capacity Demand — Israel · fragmented across tools
⬡ client confidential
FFP Capacity Demand Switzerland
GoLocal FFP Capacity Demand — Switzerland · no unified view
⬡ client confidential
GOLocal in Figma
GOLocal Buildout — Figma workspace · design system applied

Five decisions that changed how the tool works.

I wasn't trying to optimize everything at once. I focused on the decisions that would remove the biggest sources of friction — each one made from evidence, not assumption.

01
Cockpit awareness over detailed reports
The team's insight during ideation: they needed situational awareness, not more data. I designed the primary view around status signals — not tables and rows. A deployment engineer needs to see at a glance what's blocked, what's on track, and what needs intervention. I prioritized scannable status over completeness.
02
3-state badge system replacing 6
V2 testing revealed the 6-state badge system caused consistent confusion — users couldn't interpret the difference between adjacent states without reading the legend. I made the call to collapse to 3 primary states: On Track, At Risk, Blocked. Reducing options improved decision speed. Every removed state was a cognitive load reduction.
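The collapse from six states to three can be expressed as a simple total mapping. The six fine-grained source states below are assumptions for illustration; the case study only names the three shipped states (On Track, At Risk, Blocked).

```typescript
// Hypothetical sketch: collapsing a fine-grained status vocabulary into the
// three primary badge states users could read without a legend.

type DetailedStatus =
  | "planned" | "in-progress" | "launched"      // healthy
  | "delayed"                                   // needs attention
  | "waiting-compliance" | "waiting-infra";     // hard stops

type BadgeState = "on-track" | "at-risk" | "blocked";

const BADGE_MAP: Record<DetailedStatus, BadgeState> = {
  planned: "on-track",
  "in-progress": "on-track",
  launched: "on-track",
  delayed: "at-risk",
  "waiting-compliance": "blocked",
  "waiting-infra": "blocked",
};

function toBadge(status: DetailedStatus): BadgeState {
  return BADGE_MAP[status];
}
```

Because the mapping is total over `DetailedStatus`, no underlying precision is lost in the data layer; only the badge layer is simplified for decision speed.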
03
Compliance centralization as a first-class feature
Compliance requirements for restricted markets (public sector, healthcare, finance) were stored in individual team members' heads. When people changed roles, that knowledge walked out the door. I elevated compliance tracking from a secondary concern to a core dashboard feature. Institutional knowledge needs to live in the system, not in people.
04
Storyboarding every complex flow before Figma
Every complex flow was storyboarded before any Figma work began. Storyboards force edge-case decisions early: a market blocked on compliance and a market blocked on infrastructure need different treatments, and that distinction had to exist before the prototype did. Each storyboard was reviewed with a team member before moving forward. Result: zero major flow changes during the entire development phase.
05
Fluent UI — working within the system, not against it
I mapped every component to Microsoft's Fluent UI design system before any custom UI work began. On an internal Microsoft tool, consistency with the broader product ecosystem wasn't optional. I documented every component decision and edge case — the engineering team never needed to chase me for answers post-handoff.

What I chose not to solve.

Scope discipline is a design skill. These were deliberate calls — not gaps.

Clarity over feature richness
Stakeholders wanted more data surfaces and filter options. I prioritized a clear primary view over a configurable one — a tool that requires setup before it's useful has already lost the user.
→ Deferred: advanced filtering, custom views
3-state simplicity over 6-state precision
The 6-state system was more accurate. The 3-state system was actually used. Precision that creates confusion produces worse outcomes than simplicity that drives action.
→ Deferred: granular status distinctions
Foundation over personalization
Personalized dashboards per user type would have added significant value. But only after the core shared model worked reliably across all three user groups. I deferred personalization to v2.
→ Deferred: role-specific dashboard customization
Automated data over real-time sync
Full real-time API sync with all four source tools would have extended the engineering timeline significantly. I designed for structured data import that eliminated manual consolidation — without requiring live integration in v1.
→ Deferred: live API integration with all source systems
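The v1 data path above could look something like the sketch below: a structured export from the source tools is validated and normalized into the dashboard's model, replacing manual consolidation without requiring live integration. The column names, status vocabulary, and CSV format are assumptions, not the shipped import spec.

```typescript
// Hypothetical sketch of a structured-import path (v1) in place of live API sync.

interface ImportRow {
  country: string;
  workload: string;
  status: string; // normalized to lowercase
}

const KNOWN_STATUSES = new Set(["on-track", "at-risk", "blocked"]);

function parseExport(csv: string): ImportRow[] {
  const [header, ...lines] = csv.trim().split("\n");
  const cols = header.split(",").map((c) => c.trim().toLowerCase());
  return lines.map((line) => {
    const cells = line.split(",").map((c) => c.trim());
    const row = Object.fromEntries(cols.map((c, i) => [c, cells[i] ?? ""]));
    const status = row.status.toLowerCase();
    // Reject unknown statuses at import time, so bad data never reaches the dashboard.
    if (!KNOWN_STATUSES.has(status)) {
      throw new Error(`Unknown status "${row.status}" for ${row.country}/${row.workload}`);
    }
    return { country: row.country, workload: row.workload, status };
  });
}
```

Validating at the import boundary is what makes the deferral safe: live sync can replace `parseExport` later without touching anything downstream.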

Four versions. Each one earned.

I did not jump to high fidelity. On a data-dense enterprise tool, skipping low-fidelity work is how you build a beautiful interface that does not fit the information it needs to display. Each version was a direct response to what the team and the data told me was wrong with the previous one.

Version 01 · Lo-Fi · Structure First
⬡ client confidential
V1 iteration
"I need to see all workload status at a glance — not drill into each one."
Version 1.2 · Structure Refined
⬡ client confidential
V1.2 country view
Nav hierarchy validated · Status badge system needs rethink
Version 02 · Mid-Fi · Real Data Testing
⬡ client confidential
V2 tracking status
6-state badges caused confusion → simplified to 3 primary states
Version 03 → Final · Shipped ✓
⬡ client confidential
V3 final
"This is exactly what we needed. Can we just ship this?"
⬡ client confidential
Final — Summary view
GOLocal Buildout — Summary view · Shipped ✓
⬡ client confidential
Final — Country tracking
GOLocal Buildout — Country tracking with ADR Engineering Readiness
⬡ client confidential
Final — ADR Readiness
GOLocal Buildout — BareMetal RTEG Status · real-time per market
⬡ client confidential
Final — DSP Status
GOLocal Buildout — DSP Deployment Status table · production view

Forms Migration — Tenant Onboarding

Alongside GOLocal Buildout, I designed a second internal application within the same Microsoft M365 engagement: the Forms Migration app. This tool was used to onboard corporate clients into specific countries — managing the tenant migration process from application through to go-live.

The core workflow: add a tenant, configure their GEO, set user counts, confirm migration details, and initiate the migration. Before this app, the team tracked tenant onboarding in Excel with no standardized flow and no confirmation states.

⬡ client confidential
Forms Migration before — Excel
Before — Tenant migration tracked in Excel spreadsheets
Forms Migration Application — Final
Forms Migration Automation — Tenant Tracking List · Shipped ✓
Add a Tenant
Simple form — Tenant ID, GEO selection, user count. I designed this to be completable in under 60 seconds with no ambiguity in field requirements.
→ Tenant ID · Current GEO · Target GEO · User Count
Start Migration
The migration initiation flow included a mandatory sign-off confirmation — a deliberate friction point to prevent accidental starts on high-stakes tenant migrations.
→ Confirmation gate prevents accidental migrations
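The sign-off gate amounts to a small state machine: a migration cannot start from a single action, only from an explicit request followed by an explicit confirmation. The sketch below is illustrative; the state names and functions are assumptions, not the Forms Migration app's actual implementation.

```typescript
// Hypothetical sketch of the mandatory sign-off gate before a tenant migration starts.

type MigrationState = "draft" | "awaiting-confirmation" | "migrating";

interface Tenant {
  id: string;
  state: MigrationState;
  confirmedBy?: string;
}

// Step 1: requesting a migration only moves the tenant to a pending state.
function requestMigration(t: Tenant): Tenant {
  if (t.state !== "draft") throw new Error("Migration already requested");
  return { ...t, state: "awaiting-confirmation" };
}

// Step 2: a separate, named sign-off is required to actually start.
function confirmAndStart(t: Tenant, signOffBy: string): Tenant {
  if (t.state !== "awaiting-confirmation") {
    throw new Error("Sign-off requires a pending migration request");
  }
  return { ...t, state: "migrating", confirmedBy: signOffBy };
}
```

Encoding the gate as a state transition, rather than a confirm dialog alone, is what makes the flow auditable: the record itself carries who signed off and when the transition happened.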
Design principle
Both forms were designed within Microsoft Fluent UI — consistent with GOLocal Buildout and the broader internal tooling ecosystem. Zero custom components.
→ Fluent UI · Moray UI Library · System-consistent
Outcome
Replaced a manual Excel-based tracking process with a structured, auditable onboarding flow — reducing errors in tenant configuration and giving the team a clear migration state at every step.
→ Structured flow · Auditable states · Zero config errors

Excel spreadsheets to a live global dashboard.

Before
The old way: Excel deployment tracking · 4 tools to check for one country status · 6-state badge system users could not interpret · Blockers surfaced days after they should have been caught · Compliance knowledge stored in individual heads · Hours of daily data consolidation per team member
After · GOLocal Buildout
What changed: Single source of truth across 40+ countries · Real-time deployment status · 3-state badge system interpretable without explanation · Bottlenecks visible before delays cascade · Compliance requirements centralized · Decision-making in minutes, not hours

13 people. 40 countries. One tool that works.

40+
Countries Tracked in Real Time
From Excel and 4 disconnected tools to one unified dashboard — EXO, SPO, and Teams deployment status visible at a glance across every market.
0
Design Debt Requests Post-Launch
Annotated component specs, edge case documentation, and a self-sufficient handoff package meant the engineering team never needed to chase the designer for answers.
0
Major Flow Changes in Development
Storyboarding before Figma and testing with real deployment data — not placeholder content — eliminated surprises during the build phase entirely.

What I confirmed. What I'd do differently.

01
What I'd do the same
The AEIOU observational research was the highest-leverage investment. The 7 handoff failure points were found through watching people work — not through asking them to describe their work. Observation before interviews, always. And storyboarding before Figma eliminated an entire category of development-phase surprises.
02
What I'd do differently
I would involve the support and operations staff earlier in the IA phase. Their compliance knowledge shaped the architecture late — if I'd brought them in at the start, the compliance tracking feature would have been better scoped from day one rather than refined mid-iteration.
03
What this project confirmed
Enterprise internal tools live or die on information architecture, not visual design. The GOLocal team didn't need a beautiful dashboard — they needed one that made the right information instantly readable under real pressure. Every design decision traced back to a real moment where the team was failing because the information wasn't there.
Designing for enterprise is not about making things beautiful — it is about making complex information instantly readable for someone who is under pressure and needs to act fast. Every decision in GOLocal traced back to a real moment where the team was failing because the information was not there.
— Mahesh Guntivenkata · Microsoft GOLocal M365