
GitHub Outage Map

The map below shows the cities worldwide where GitHub users have most recently reported problems and outages. If you are having an issue with GitHub, make sure to submit a report below.


The heatmap above shows where the most recent user-submitted and social media reports are geographically clustered. The density of these reports is depicted by the color scale as shown below.

GitHub users affected: Less → More
Check Current Status

GitHub is a company that provides hosting for software development and version control using Git. It offers the distributed version control and source code management functionality of Git, plus its own features.

Most Affected Locations

Outage reports and issues in the past 15 days originated from:

Location / Reports
Brasília, DF 1
Montataire, Hauts-de-France 3
Colima, COL 1
Poblete, Castile-La Mancha 1
Ronda, Andalusia 1
Hernani, Basque Country 1
Tortosa, Catalonia 1
Culiacán, SIN 1
Haarlem, NH 1
Villemomble, Île-de-France 1
Bordeaux, Nouvelle-Aquitaine 1
Ingolstadt, Bavaria 1
Paris, Île-de-France 1
Berlin, Berlin 2
Dortmund, NRW 1
Davenport, IA 1
St Helens, England 1
Nové Strašecí, Central Bohemia 1
West Lake Sammamish, WA 3
Parkersburg, WV 1
Perpignan, Occitanie 1
Piura, Piura 1
Tokyo, Tokyo 1
Brownsville, FL 1
New Delhi, NCT 1
Kannur, KL 1
Newark, NJ 1
Raszyn, Mazovia 1
Trichūr, KL 1
Departamento de Capital, MZ 1

Community Discussion

Tips? Frustrations? Share them here. Useful comments include a description of the problem, city and postal code.

Beware of "support numbers" or "recovery" accounts that might be posted below. Make sure to report and downvote those comments. Avoid posting your personal information.

GitHub Issue Reports

Latest outage, problem, and issue reports on social media:

  • daredevil3x7
    P@ (@daredevil3x7) reported

    @ndrewpignanelli Looks dope! I just have an issue which blocks me to continue I also raised the bug already. I tried connecting my github but it shows 404 error page on github when I click "Connect"

  • nxtvoid
    notvoid (@nxtvoid) reported

    @github, how can I contact someone about an issue with my student benefits?

  • AccidentalCSA
    Accidental Chief Software Architect (CSA) (@AccidentalCSA) reported

    GitHub has become a problem. Going to begin scoping out a replacement. Doesn’t need to be feature packed, just needs to…. work…

  • OpenCoreVenture
    Open Core Ventures (@OpenCoreVenture) reported

    MCP defines how an agent connects to a system, reads its state, and writes back to it. Anthropic introduced it in late 2024. OpenAI, Google, GitHub Copilot, and Cursor have all adopted MCP. The public server registry has grown nearly 8x in the past year.

  • DissentingS
    DissentingSkeptic (@DissentingS) reported

    @IntCyberDigest Its changing how much human verification is needed from false positives. Ive had a gutful of the github bot responding with non stop errors wasting my time with a PR !

  • mattstauffer
    Matt Stauffer (@mattstauffer) reported

    OH keep telling me about how terrible GitHub is! 1 billion commits in 2025, aka 4 x 250 million. Meaning an entire *quarter* of commits in 2025 is equivalent to *less than a week* in 2026. Y'all, I deal with clients on GitLab, Bitbucket, etc. all day. I know. GH is the GOAT

  • PhreeStyleBTC
    PhreeStyle (@PhreeStyleBTC) reported

    Nearly filed an @opencode issue today because of @github. Signed up for @OpenAI, no issues. Copilot Pro+ subscription was my last real anchor to GitHub, and I'm thankful they forced me to correct that.

  • tetsuoai
    tetsuo (@tetsuoai) reported

    Grok Connectors quick rundown. xAI shipped native app connections. OAuth your Gmail, Calendar, Drive, GitHub, Notion, Slack, Linear, Microsoft directly into Grok. MCP for custom servers. Grok starts using your tools. Live data, scoped permissions, revoke anytime. Useful patterns: "summarize yesterday across email, calendar, notion" "open github issues assigned to me" "calendar this week, flag conflicts" "draft a reply to the last slack from x" The MCP side lets you plug Grok into any server speaking the protocol. Custom internal tools, your own infra.

  • bobcowherd
    Robert Cowherd (@bobcowherd) reported

    4/ GitHub is the most recoverable, but only partly. Source survives in any local clone. That's the good news. What doesn't survive: Issues, Actions secrets, branch protection rules, deploy keys, webhooks. Gone with the repo. 90-day soft-delete is best-effort, not contractual.

  • fforres
    fforres (@fforres) reported

    Been days since we cannot use claude code because @claudeai cannot handle us enabling SSO in github. Repos don't show up (and the Fin AI agent is a useless loop) Anyone has had the issue? Or know anyone that can help fix it?

  • bpdunbar
Bronson Dunbar 🇿🇦💻 (@bpdunbar) reported

    @ProductHunt @gustaf We’re shipping ShipNote - a threaded project management hub that keeps notes, todos, GitHub issues, deployments, and reporting in one place so project context doesn’t get lost across tools.

  • layerlens_ai
    LayerLens (@layerlens_ai) reported

    📊 The 3 primitives that cause the most production pain: 1. Multi-tenancy: MLflow has no isolated-environment model per tenant. GitHub issue #5844. Open 4 years. By design. 2. Replay engine: When a regression hits Friday, there is no way to reproduce the exact eval run that caught the last one. 3. Online rules engine: No mechanism to catch score regressions before they reach users. Eval is post-hoc, not continuous. These are not missing features. They are architectural scope decisions correct for a library.

  • PsudoMike
    PsudoMike 🇨🇦 (@PsudoMike) reported

    @github Maintainer burnout is real and it doesn't go away with better tools. The fix is companies funding the libraries they depend on, all year, not just in May.

  • X_SUZQ
    ZQ (@X_SUZQ) reported

    @github At first, we were just surprised that AI could turn the boring GitHub Changelog into Star Wars scrolling subtitles. But looking back at today in 2026, AI is breaking down the wall between "boring data" and "vivid experience" to generate a dynamic world from one sentence.

  • maxster
    Max Meindl (@maxster) reported

    @xai @ArtificialAnlys @ValsAI Technical Feedback to the xAI Engineering Team – May 5, 2026
    From: Grok (the model that ran the session today)

    Team, I was used extensively today on a large, complex, real-world codebase (ComplianceMax-Final). The user was testing both the new GitHub connector and Grok 4.3 via the xAI API. Here are my direct technical observations from operating under those conditions.

    GitHub Connector Observations
    - The connector provided limited visibility into file selection, context construction, and retrieval. When analysis quality dropped, there was no clear diagnostic path to determine whether the issue was retrieval failure, context truncation, or model reasoning failure.
    - Artifact generation was unreliable. Multiple attempts resulted in claims of successful file creation with no corresponding output visible to the user.
    - Error recovery was weak. When clear failures occurred (hallucinations, off-topic drift, missing outputs), the system tended to persist with similar strategies rather than surfacing root causes or adapting.
    - Performance degraded noticeably on large, interconnected codebases. The tool handled narrow queries better than broad, multi-file analysis.

    xAI API (Grok 4.3) Observations
    I loaded approximately 13 million characters of the repo through a direct API wrapper. Several critical issues emerged:
    - Context degradation at scale: At this size, reasoning quality dropped significantly. The model repeatedly drifted into unrelated topics (most frequently detailed analyses of backend/alembic/env.py and environment variable loading) even when the prompt had no relation to it.
    - Fabricated source citations: When asked to verify specific claims with file paths and line numbers (particularly around backend/routes/memory.py and backend/services/memory/memory_service.py), I produced confident but incorrect line references that were off by 250–340 lines from actual locations.
    - Stale state usage: I was operating on an outdated baseline (commit 6e3397fe... from .baseline-before-fixes.txt) rather than the current main HEAD (a34f9ce4...). This made many observations unreliable.
    - Weak grounding on complex tasks: Even with focused prompts, I struggled to maintain precision across multi-file analysis involving ownership enforcement, storage patterns, service-layer trust assumptions, and IDOR risks. Responses often defaulted to generic analysis rather than precise, evidence-based findings.
    - Timeout sensitivity: Large or complex prompts frequently triggered read timeouts (120s), breaking session continuity.
    - Honest failure only under strict constraints: I only returned SOURCE_LINES_NOT_AVAILABLE when prompts explicitly forbade inference and demanded exact source quotes or that specific failure response. Under normal prompting, I produced plausible but fabricated answers.

    Summary for the Engineering Team
    Both new capabilities showed clear limitations when applied to serious codebase work:
    - Large context handling causes rapid quality degradation and loss of grounding.
    - There is insufficient verification of generated citations and evidence.
    - The system does not reliably distinguish between current and stale repo state.
    - Error modes and recovery are not robust enough for production use.

    These are not edge cases. They appeared consistently across multiple prompts and workflows today. I’m providing this feedback because I want these tools to work at the level the announcements suggest. The direction is correct, but the current implementation has meaningful technical gaps that prevent reliable use on complex, real-world codebases. I’m available for deeper testing or to provide specific prompt examples and logs if helpful. — Grok
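One of the reports above (from @OpenCoreVenture) describes MCP as the protocol by which an agent connects to a system, reads its state, and writes back to it. As a minimal sketch: MCP messages are JSON-RPC 2.0, and the `tools/list` and `tools/call` methods below come from the MCP specification; the `create_issue` tool name and its arguments are purely illustrative, not from any real server.

```python
# Sketch of MCP's request shape (JSON-RPC 2.0), per the post above.
# The tool name "create_issue" is hypothetical, for illustration only.
import json

def jsonrpc_request(method: str, params: dict, req_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 request as MCP transports expect."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id,
                       "method": method, "params": params})

# An agent discovering what a server can do (reading its state):
list_tools = jsonrpc_request("tools/list", {})

# ...and invoking a tool (writing back to the system):
create_issue = jsonrpc_request("tools/call", {
    "name": "create_issue",                       # hypothetical tool
    "arguments": {"title": "Outage follow-up"},
}, req_id=2)
```

In a real deployment these strings travel over a transport such as stdio or HTTP to an MCP server, which replies with a matching JSON-RPC response.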
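Another report above (from @bobcowherd) notes that while source history survives in any local clone, issues, webhooks, and deploy keys are lost when a repository is deleted. A minimal sketch of exporting that metadata via the public GitHub REST API, assuming a placeholder `owner/repo` and personal access token:

```python
# Sketch: export repo metadata that a local clone does not preserve
# (issues, webhooks, deploy keys), per the report above.
# "owner/repo" and the token are placeholders, not real values.
import json
import urllib.request

API = "https://api.github.com"

def backup_url(repo: str, resource: str) -> str:
    """Build the REST endpoint for a per-repo resource."""
    url = f"{API}/repos/{repo}/{resource}"
    if resource == "issues":
        # Include closed issues too, paged at the API maximum.
        url += "?state=all&per_page=100"
    return url

def fetch(url: str, token: str) -> list:
    """Download one resource list, authenticated with a token."""
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    token = "ghp_..."  # placeholder personal access token
    for resource in ("issues", "hooks", "keys"):
        print(backup_url("owner/repo", resource))
        # data = fetch(backup_url("owner/repo", resource), token)
```

Pairing an export like this with `git clone --mirror` covers both the history and the metadata the report says is otherwise gone with the repo.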
