
GitHub status: access issues and outage reports

No problems detected

If you are having issues, please submit a report below.

Full Outage Map

GitHub is a company that provides hosting for software development and version control using Git. It offers the distributed version control and source code management functionality of Git, plus its own features.

Problems in the last 24 hours

The graph below shows the number of GitHub reports received over the last 24 hours, by time of day. When the number of reports exceeds the baseline, represented by the red line, we consider an outage to be in progress.
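The detection rule described above can be sketched in a few lines (a simplified illustration; how the real baseline is computed from historical report volume is not stated, so the fixed threshold here is an assumption):

```python
def outage_detected(reports_per_hour, baseline):
    """Flag each hour whose report count exceeds the baseline (the red line)."""
    return [count > baseline for count in reports_per_hour]

# Hypothetical report counts over six hours, against a baseline of 20 reports/hour
counts = [5, 12, 18, 45, 60, 22]
print(outage_detected(counts, baseline=20))  # [False, False, False, True, True, True]
```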

At the moment, we haven't detected any problems at GitHub. Are you experiencing issues or an outage? Leave a message in the comments section!

Most Reported Problems

The following are the problems GitHub users have reported most often through our website.

  • Website Down (58%)
  • Errors (33%)
  • Sign in (9%)

Live Outage Map

The most recent GitHub outage reports came from the following cities:

City        Problem Type   Report Time
Colima      Website Down   2 days ago
Poblete     Website Down   2 days ago
Ronda       Website Down   3 days ago
Montataire  Errors         3 days ago
Montataire  Website Down   4 days ago
Tortosa     Website Down   6 days ago

Community Discussion

Tips? Frustrations? Share them here. Useful comments include a description of the problem, city and postal code.

Beware of "support numbers" or "recovery" accounts that might be posted below. Make sure to report and downvote those comments. Avoid posting your personal information.

GitHub Issues Reports

Latest outage, problems and issue reports in social media:

  • RetroChainer
    RetroChainer (@RetroChainer) reported

    google makes 238 billion dollars a year on ads. one developer wrote a tool that blocks all of it. before it ever reaches your devices. it's called pi-hole. 55,700 stars on github.

    how it works:
    > runs on a $35 raspberry pi or any old linux machine
    > becomes the dns server for your entire network
    > all ad domains are sunk before your browser ever requests them
    > nothing installed on your phone, tv, or tablet

    what it blocks beyond ads:
    > facebook tracking pixel
    > google analytics on every site
    > smart tv telemetry
    > data brokers listening on your network
    > app telemetry phoning home

    the result: smart tv stops loading ads. phone browses clean. kids don't see ads on their tablet. one config file. one evening. a 238-billion-dollar industry neutralized for $35. 100% open source. free forever.
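The mechanism this post describes, DNS sinkholing, can be illustrated with a toy resolver: every lookup is checked against a blocklist, and blocked domains resolve to an unroutable address before any connection is ever made. This is a sketch of the idea only, not Pi-hole's actual code, and the blocklist entries are made up:

```python
# Hypothetical blocklist entries; a real Pi-hole install loads lists with
# hundreds of thousands of ad and tracker domains.
BLOCKLIST = {"ads.example.com", "tracker.example.net"}

def resolve(domain, upstream):
    """Sinkhole blocklisted domains; forward everything else to the upstream resolver."""
    if domain in BLOCKLIST:
        return "0.0.0.0"  # unroutable: the ad request goes nowhere
    return upstream(domain)

# Stand-in for a real upstream DNS server
upstream = lambda domain: "93.184.216.34"

print(resolve("ads.example.com", upstream))  # 0.0.0.0
print(resolve("example.com", upstream))      # 93.184.216.34
```

Because the sinkhole sits at the DNS layer for the whole network, no client device needs any software installed, which is the property the post emphasizes.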

  • syntoythesis
    Bored Devops ☠️🛠⚙️ (@syntoythesis) reported

    @esrtweet As someone who's felt the sting of having to run make -- it was a long time ago, but I remember wondering, "Where's the exe download?" when I first went to Github -- this feels like a nearly solved problem with LLMs.

  • Capta1nCodes
    Priyal Raj (@Capta1nCodes) reported

    Hey devs, I need a small help. I have a CI/CD pipeline on GitHub Actions; the issue is I'm getting an error, and it's hard to re-deploy everything on GH Actions each time. Can I run that CI/CD directly from my PC? Locally, I mean? About to ask GPT, but I want your inputs too.
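One common answer to the question above (an assumption; the poster's setup isn't shown) is nektos/act, which replays GitHub Actions workflows in local Docker containers: `act -l` lists the jobs and `act -j <job-id>` runs one. As a minimal illustration of the structure such tools work with, here is a stdlib-only sketch that extracts the job ids from a workflow file, using a naive indentation-based scan rather than a full YAML parser:

```python
import re

def list_jobs(workflow_text):
    """Return the job ids defined under the top-level `jobs:` key of a
    GitHub Actions workflow (naive two-space-indent scan, not full YAML)."""
    jobs, in_jobs = [], False
    for line in workflow_text.splitlines():
        if re.match(r"^jobs:\s*$", line):
            in_jobs = True
            continue
        if in_jobs:
            if re.match(r"^\S", line):  # a new top-level key ends the jobs block
                break
            m = re.match(r"^  ([A-Za-z_][\w-]*):\s*$", line)
            if m:
                jobs.append(m.group(1))
    return jobs

workflow = """\
name: CI
on: push
jobs:
  build:
    runs-on: ubuntu-latest
  deploy:
    runs-on: ubuntu-latest
"""
print(list_jobs(workflow))  # ['build', 'deploy']
```

In practice you would point act at the real workflow directory and let it run the jobs in containers; the sketch only shows how job ids are laid out in the file.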

  • alphabatcher
    Alpha Batcher (@alphabatcher) reported

    > followed 200 launch accounts
    > watched every demo video
    > saved every “tools you need” thread
    > built nothing
    > opened GitHub
    > clicked karpathy
    > found nanoGPT and llm.c
    > clicked ggerganov
    > realized local AI was built by people doing hard C++ work
    > clicked Tim Dettmers
    > understood why QLoRA changed who can finetune
    > clicked Paul Gauthier
    > saw aider treating *** like the agent’s memory
    > clicked Simon Willison
    > found tiny tools that actually survive contact with reality

    that was the shift. GitHub is a map of where the next products come from. every account tells you one thing: what just became possible?

    > karpathy makes the model understandable
    > ggerganov makes it local
    > Tri Dao makes it faster
    > Tim Dettmers makes it cheaper
    > Yohei makes the loop weird enough to copy
    > aider makes coding agents usable
    > Instructor makes outputs reliable
    > LlamaIndex makes company data usable
    > Ollama makes local models installable

    then your job is to build the missing boring layer:

    > UI
    > workflow
    > templates
    > vertical packaging
    > docs
    > benchmarks
    > hosted version
    > done-for-you setup

    most people star repos and feel technical. builders run the repo, break it, and ship the weekend wrapper.

    > pick 3 accounts
    > read the README
    > run the code
    > open the issues
    > find the missing layer
    > ship one tiny thing by Friday

  • byLuocca
    Luca Marchetti (@byLuocca) reported

    I am building my first product Today I’m building the login What makes the biggest difference: - magic links - Login with Google - Login with GitHub

  • esrtweet
    Eric S. Raymond (@esrtweet) reported

    planefag, I'm not excusing the attitude of the guy who pissed you off. But there is an explanation for it, and I'm going to put on my Mister Open Source hat and lay it on you.

    The real reason there aren't prominent links to downloadable binaries on forge sites like GitHub is that in open-source land there is no such thing as a truly portable binary. Windows and Mac make binary distribution easy by being limited to a single hardware platform and a single ABI - application binary interface. (The assertion I just made can be quibbled with at the edges. I will be unkind to anyone who attempts this.)

    An application binary interface is a set of conventions for how you decorate your binary so the operating system's program loader knows what to do with it, and how you write traps from your binary to call operating system services. Windows and Mac have, effectively, just one ABI each. So you can generate one binary for, say, Windows, attach it to a download link, and Windows users will generally not come back screaming for your blood because it fails to work in some obscure way. (Again, this statement can be quibbled with, but see this whacking great truncheon in my hand? Just don't.)

    There is no such grace in open-source land. There are a whole bunch of complicated historical reasons for this, starting with the fact that Linux runs on more different hardware architectures, and continuing with the fact that Linux isn't the only game in town (there are the BSDs), and continuing into technical minutiae that would make your head hurt, and continuing further into technical minutiae that make *my* head hurt.

    But what this actually means is that if you want to provide binaries and not get sperg-screamed at, you can't just provide one. You'd have to provide many, and no matter how comprehensive you try to be, somebody is going to be disgruntled because you didn't cover their corner case. This is not a cost-free proposition. For each different kind of binary you provide, you need to cross-compile your source code in a different environment, many of them on distributions and hardware platforms you don't have routine access to. So people almost never do it at all.

    Because most projects don't do this, sites like GitHub don't see any demand push to make binary download links really accessible. Instead, the problem is normally handled at a different level. Your distribution maker keeps huge sets of compiled binaries lightly hidden inside of installable packages, tuned for the ABI of that single distribution. Your package manager hides from you the packages for everything but your hardware architecture.

    The person who pissed you off was rude, but he wasn't exactly wrong about the objective facts. What you want isn't practically possible. Instead of being annoyed because GitHub doesn't feature binary-download links, search for that software using your package manager. Sometimes you won't find it. That's when you have to download the source and bust out a compiler. Sorry, but that's the way it is. We're trying as hard as we can - really, we are. But the complicated shape of the terrain constrains what we can achieve.

  • VibeCoderOfek
    Ofek Shaked (@VibeCoderOfek) reported

    Just watched an agent go from GitHub issue to merged PR in one pass. This isn’t ‘AI helps you code’ anymore this is the terminal becoming the new IDE. My backend workflows are about to look prehistoric.

  • SRKDAN
    SRKDAN (@SRKDAN) reported

    2/ WHAT SPECIFICALLY CHANGED GPT-5.5 was retrained end-to-end for agentic work. 82.7% on Terminal-Bench 2.0, testing complex command-line workflows requiring planning, iteration, and tool coordination. 58.6% on SWE-Bench Pro, resolving real GitHub issues end-to-end in a single pass. Uses fewer tokens than GPT-5.4 at the same latency. Smarter and cheaper.

  • Oluwaphilemon1
    FHILY👑 (@Oluwaphilemon1) reported

    BREAKING NEWS: This 13-year-old Thai student solves Codeforces rating 800 problems in C++ in 45 seconds through an AI agent he built himself on Claude Code and posted to an open GitHub repo. He sits in a regular school room with a MacBook Air on the desk, a silent HHKB Type-S keyboard for $300, and a timer in the frame. In the browser Codeforces is open, in VS Code an empty .cpp file, and in the corner of the screen a Claude Code window.

  • Kiwi_Nod
    KiwiNod (@Kiwi_Nod) reported

    @OludayoDamilol @pharos_network Front-end race conditions on mobile? Okay, that's a real attack vector most auditors miss. 📱 But "logic" isn't a GitHub repo. Show me a write-up, a PoC, or even a thread breaking down a specific exploit you found. What dApp did you break? Where's the teardown? 🥝

  • mylifcc
    lifcc (@mylifcc) reported

    @mattpocockuk the backlog-burn angle lands. been writing Claude Code Skills lately — hardest part is making priority calls reproducible instead of LLM vibes. does /triage hit GitHub/Linear APIs directly, or do you paste the issue list in? curious how you keep dedup stable across runs.

  • makinola86
    LÏÇHÄ (@makinola86) reported

    @Atom_Adeyemi Pls drop the step 6 github link to copy it's showing error

  • codersGyan
    Rakesh K (@codersGyan) reported

    The “what language should I learn next?” question shows up a lot. Every week, someone asks: Should I learn Go or Rust next? Is Bun worth it? Should I pick up Elixir? Then you check their GitHub: 4 incomplete projects. It's a pattern.

    The truth nobody likes hearing: senior backend engineers don't get hired for the languages on their resume. They get hired for systems they've actually shipped. A finished, deployed app beats five tutorial repos. Even if it's small. Even if it's messy. Even if three people use it.

    Because once real users touch it, things change. Something breaks. Something is slower than expected. Something behaves differently in production. That's where the real learning starts.

    Pick the language you already half-know. Build something real. Put users on it. Watch what breaks. Fix it. Then your “should I learn X next?” question has an answer. Because you'll know exactly what your current stack can't do.

  • PrajwalTomar_
    Prajwal Tomar (@PrajwalTomar_) reported

    Most people using Claude are wasting HOURS re-explaining the same thing every session. This CLAUDE.md hit #1 on GitHub with 82K stars and most Claude users still don't know it exists. This permanently fixes the repetition problem.

  • remilouf
    Rémi (@remilouf) reported

    No one is going to replace GitHub with a "more reliable GitHub", they will fix it before you go to market. There’s an opportunity with the AI thing going on, but it’s not going to look like a GitHub competitor for a few years.

  • crystalwizard
    Crystalwizard (@crystalwizard) reported

    suggest you get the code for ffmpeg from their github and fix the issues that have recently been pointed out to them, that they refuse to fix, and put that version up for your users to download

  • ToadSprockett
    Paul Davis (@ToadSprockett) reported

    After 10 years, I finally shut down my GitHub account. I'd been paying $120 a year for a secure remote location to store my code, but over the last year it's become unstable. When I went to create a simple wiki so I could access my Markdown files, they forced me to make the repository public — keeping it private would have cost an extra $50 a year. There's an adage in integration work: you inherit all the problems of the system you're integrating with. So I moved everything over to GitLab and haven't looked back. Then Friday, I deleted everything on GitHub and canceled it all. I'm a solo game developer — I just need something that works. I don't need all these problems they're dealing with. Whether it's AI or just lazy coding, it doesn't matter; I can't afford the nonsense. Good luck to those who are staying.

  • rubelr44
    Red (@rubelr44) reported

    you're paying google $10/month to sit in their server room. dropbox gets $12/month. apple gets $10. the kicker? they can all see your stuff. and when dropbox got breached in 2024? emails, passwords, and tokens were just... out there.

    there’s this tool called syncthing and it’s honestly kind of a cheat code. no cloud. no company servers. no middleman watching you. it just syncs your files directly between your own devices. peer-to-peer. it's got like 81k stars on github so it’s legit.

    here is why it wins:
    direct sync: files go from your phone to your pc. they never touch a 3rd party.
    privacy: encrypted with tls and crypto certificates.
    zero friction: no accounts. no sign-ups. just install it and share a device id.
    everywhere: works on windows, mac, linux, android... even solaris if you're into that.
    safety net: it has file versioning. if you accidentally delete something, you can just roll it back.

    the wildest part is that syncthing isn't even a company. it's a swedish non-profit. there is no "cloud" to shut down. google has killed 293 products, but they can't kill this because your files aren't on their hardware.

    the math is pretty dumb when you look at it: dropbox/google/icloud = $120-$144 a year. syncthing = $0. unlimited storage. unlimited devices. it's been around since 2013 and it's 100% open source.

    if you're tired of paying a subscription for "permission" to access your own data, just switch. your hardware. your files. forever.

  • TesfiApp
    Can (@TesfiApp) reported

    my full stack — what I actually pay to run this thing every month
    Claude = coding. ($20/mo)
    Supabase = backend. (Free)
    Vercel = deploying. (Free)
    TestFi = user testing. ($1.99/tester)
    Namecheap = domain. ($12/yr)
    Stripe = payments. (2.9%/transaction)
    GitHub = version control. (Free)
    Resend = emails. (Free)
    Cloudflare = DNS. (Free)
    PostHog = analytics. (Free)
    Sentry = error tracking. (Free)
    Total: ~$20/month. No team, no VC, 8 months in.

  • SurgeonOnChain
    Surgeon On Chain (@SurgeonOnChain) reported

    Next day, brute-forced a fix. Downloaded MetaMask v13 unpacked from GitHub (v14 removed the SRP-import path), loaded it via Playwright, imported the v3 key, clicked through Polymarket UI: connect → sign → enable trading → sign again. The Enable Trading sign was the missing step.

  • LLMJunky
    am.will (@LLMJunky) reported

    @emanueledpt no, that's the normal github screen lol. it's a joke because it's also frequently down

  • fabulouskid4u
    Omoboye Moses (@fabulouskid4u) reported

    They Found One Person 6 Months Ahead and Studied How They Think Not a guru with 500,000 followers. Someone close enough to be relevant. Far enough to be useful. Study their GitHub commits. Read their write-ups. Ask specific questions about their problem-solving process. Proximity to the right thinking accelerates everything.

  • Anupam_Devops
    Anupam (@Anupam_Devops) reported

    You don't need to know everything in DevOps. I know. The roadmap said otherwise. You opened it. Saw Kubernetes, Terraform, Ansible, Docker, Jenkins, GitHub Actions, Prometheus, Grafana, Vault, ArgoCD, Helm, Istio, Pulumi, AWS, GCP, Azure… And you thought: I need all of this before I'm ready. You don't. That's the trap.

    The overload trap:
    🔴  Jump between 10 tools, master none
    🔴  Start a new course every time you see a job posting
    🔴  Feel perpetually behind — because the list never ends
    🔴  Ship nothing. Build nothing. Just consume.

    What actually works:
    🟢  Pick one cloud. Go deep — not wide.
    🟢  Learn Docker → then Kubernetes. In that order. Slowly.
    🟢  Build a real pipeline end-to-end. Break it. Fix it.
    🟢  Understand why a tool exists before learning how to use it
    🟢  One project shipped beats 12 tutorials watched.

    The engineers who know everything? They don't exist. They just know their stack really well and know how to learn the rest fast when they need it. Breadth comes with time. Depth is what gets you hired. Stop collecting tools. Start building fluency.

  • invokespecial
    Alejandro (@invokespecial) reported

    @GitHub @kdaigle this billing flow feels broken:
    Buy Copilot Pro ($100/year)
    Upgrade to Pro+
    Get assigned a Copilot seat by an org → partial refund
    Lose the seat → your $100 is gone
    That can’t be right… right? Ticket: 4298296

  • caspian_ji
    Caspian (@caspian_ji) reported

    @peer_rich I've not once experienced GitHub being down and I'm on it every single day. folks who are whining are not paying for enterprise. pay up and you won't have these issues

  • ZachSDaniel1
    Zach Daniel (@ZachSDaniel1) reported

    @InfinityDZ @RootCert Not currently. We have many of the bases covered. Only other things are an OAuth2 server and file storage (in GitHub now but unreleased).

  • Sneha625885
    Sneha (@Sneha625885) reported

    Today, I contributed to 3 problems on GitHub in C++ (easy problems). And made a responsive face card website with HTML and CSS. #GitHub #CodingLab

  • ChronCode
    CodeChron (@ChronCode) reported

    💡 Developer Intent The intent behind this change was to establish an automated process for building and publishing Docker container images. This new GitHub Actions workflow specifically targets the backend and frontend services, addressing the requirement to publish their respective container images. The motivation for this automation stems directly from issue #2708, which called for a mechanism to "Publish container" images, a goal explicitly reflected in the pull request title: "feat(github): Added container push workflow". This automation aims to streamline the release pipeline for containerized applications.
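A minimal sketch of the kind of workflow the change summary above describes; the service names, paths, and registry here are assumptions, and the steps use the official actions/checkout, docker/login-action, and docker/build-push-action:

```yaml
name: publish-containers
on:
  push:
    branches: [main]

jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write   # needed to push to GitHub Container Registry
    strategy:
      matrix:
        service: [backend, frontend]   # hypothetical service directories
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          context: ./${{ matrix.service }}
          push: true
          tags: ghcr.io/${{ github.repository }}/${{ matrix.service }}:latest
```

The matrix builds both images from one job definition; whether the original PR used a matrix or two separate jobs is not stated in the post.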

  • jahirsheikh8
    Jahir Sheikh (@jahirsheikh8) reported

    @Saanvi_dhillon They won't; they are busy with GitHub to fix it.

  • Shrit1401
    Shrit (@Shrit1401) reported

    it's so funny github is struggling to stay live because pre-AI commits were not that much, however everybody is using agentic AI / vibe coding wtvr u name it, we're spamming commits, and github is reaching its limit for resources. it will be interesting to see how they try to solve this problem.