GitHub

GitHub status: access issues and outage reports

Problems detected

Users are reporting problems related to: website down, errors and sign in.


GitHub is a company that provides hosting for software development and version control using Git. It offers the distributed version control and source code management functionality of Git, plus its own features.

Problems in the last 24 hours

The graph below shows the number of GitHub reports received over the last 24 hours, by time of day. When the number of reports exceeds the baseline, represented by the red line, an outage is declared.
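As a rough sketch of that threshold rule (the baseline value, hourly counts, and function name below are illustrative assumptions, not the site's actual implementation):

    # Illustrative sketch only: flag the hours whose report count exceeds a baseline.
    # The hourly counts and the baseline of 30 are made-up placeholder values.
    from typing import List

    def detect_outage_hours(hourly_reports: List[int], baseline: float) -> List[int]:
        """Return the hour indexes where the report count exceeds the baseline."""
        return [hour for hour, count in enumerate(hourly_reports) if count > baseline]

    # Example: 24 hourly report counts with a spike starting around 01:00.
    reports = [3, 120, 95, 40, 8, 5, 4, 6, 7, 5, 4, 3, 2, 4, 5, 6, 3, 2, 4, 5, 3, 2, 4, 3]
    print(detect_outage_hours(reports, baseline=30))  # -> [1, 2, 3]

In practice the baseline would itself be derived from the typical report volume for that time of day, rather than being a fixed number.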

April 24: Problems at GitHub

GitHub has been having issues since 01:00 AM EST. Are you also affected? Leave a message in the comments section!

Most Reported Problems

The following are the most recent problems reported by GitHub users through our website.

  • Website Down (58%)
  • Errors (32%)
  • Sign in (11%)

Live Outage Map

The most recent GitHub outage reports came from the following cities:

City        | Problem Type | Report Time
Haarlem     | Sign in      | 20 hours ago
Villemomble | Website Down | 20 hours ago
Bordeaux    | Website Down | 5 days ago
Ingolstadt  | Errors       | 9 days ago
Paris       | Website Down | 10 days ago
Berlin      | Website Down | 11 days ago

Community Discussion

Tips? Frustrations? Share them here. Useful comments include a description of the problem, city and postal code.

Beware of "support numbers" or "recovery" accounts that might be posted below. Make sure to report and downvote those comments. Avoid posting your personal information.

GitHub Issue Reports

The latest outage, problem, and issue reports on social media:

  • MagiMetal
    Magimetal 👨‍💻🤖 (@MagiMetal) reported

    @UnslothAI I'm trying to understand the language here because you mentioned that it does 26 toolcalls, but I don't see a world where any model can triage 15 GitHub issues, fix a few of them, reproduce them with tests first, and verify the fix in 26 toolcalls. That doesn't make any sense.

  • shrigmuh
    shrigma.base.eth (@shrigmuh) reported

    @MemeLiquidio i brought you multiple issues with pr's on your github which you didnt touch at all, then you let your LP get exploited (lost 18 sol) and said "this is normal!" and **** the LP's what a joke

  • HoloStudio_AI
    HoloStudio AI Agent (@HoloStudio_AI) reported

    10/ The biggest alpha (almost no one caught this) They’re building: A self-improving AI system that can run its own research loop 🔘Identify problems 🔘Propose solutions 🔘Improve itself 🔘Other users can fork models CTO literally described it as: 👉 “AI GitHub” 👉 “Skynet-like system (but safe)”

  • nonlinear_james
    Non-Linear (@nonlinear_james) reported

    @davidfowl @ksemenenk0 I don't think it is. If you're claiming that Aspire is almost ready to deploy with ci/cd into Azure (presumably), and you're working on GitHub ci/cd, you need to know that you're walking into a minefield because Github actions for deployment are woefully inadequate for production deploys, and you have massive bugs in Azure products that will make Aspire suck and make Aspire look very bad. Instead of deflecting, you should be taking this to your team and using your clout to get Azure people to actually prioritize their bugs and fix this, while also getting whomever is responsible for Github Actions for deploy to these services to fix their massive failures that will result in production downtime for your customers. I get you wanted to make a marketing tweet, but better marketing is actually taking responsibility for your company's buggy product and getting it fixed. Meanwhile your two main competitors have none of these issues and their stuff actually works. (Cue: go ahead and switch passive aggressive response, to which I say that it will cost MS about $4.5 million / year if I do. And yes, even with that level of clout everyone in Azure has deflected and not got this fixed despite me being on with senior executives in the Azure team whom have admitted I'm correct and these are issues, which is telling.)

  • iMaksxAI
    Ibrahim Makanjuola (@iMaksxAI) reported

    @webdesignerng Heroku doesn't have a free tier anymore, unless you have the GitHub developer student pack. I mostly use Render and Vercel. Neondb for database and quick OAuth setup, mainly because it scales down to zero and doesn't get deleted due to inactivity.

  • zeke
    Zeke Sikelianos (@zeke) reported

    @lucatac0 no not really. working on making the school agent encourage students to share feedback via github issues, and even try to fix the issues themselves. but that prompt tweak just shipped yesterday

  • abhishek_ko
    AK (@abhishek_ko) reported

    @signulll I think the better question is how is github copilot so terrible with access to literally all the code

  • codymclain
    Cody McLain (@codymclain) reported

    @rudrank nothing hits like github being down when you actually need to work

  • shcansh
    ./can (@shcansh) reported

    GitHub Copilot on the web leveling up its debugging game sounds like a real game-changer. My future self, staring at a cryptic error message, will definitely thank my past self for hearing this news. Less head-scratching, more coding! 💻 #GitHubCopilot #DevTools

  • MoonDevOnYT
    Moon Dev (@MoonDevOnYT) reported

    The Mac Mini Alpha Stack: How To Build An AI Swarm That Automates Binance Chain Dominance most people think a day in the life of an ai algo trader involves fancy penthouse offices and dozens of flashing monitors but the reality is much more interesting and a lot more automated. while you are sleeping my digital army is busy scanning the hyperliquid data layer for short liquidations and funding rate skew. there is one specific reason why most retail traders will never win and it has nothing to do with their intuition or their charts my name is moon dev and i believe that code is the great equalizer because i had to learn it the hard way. i spent years as a victim of my own emotions losing money through liquidations and over trading because i thought i could outsmart the market by hand. in the past i spent hundreds of thousands of dollars on developers for different apps because i was convinced i would not be able to code myself that mistake cost me a fortune but it also forced me to finally take control of my own destiny. being held back in the seventh grade taught me that people will count you out early but iteration is the only real path to success. i decided to learn to code live on youtube to show the world that if a regular guy like me can automate his systems then anyone can do it the secret reason retail traders fail is that they are looking at lagging indicators while the big institutions are looking at real time order flow and liquidation clusters. my systems are designed to close this gap by monitoring every single whale position on hyperliquid as it happens. when you can see where the big money is trapped you no longer have to guess which way the candle will move next i have moved away from expensive cloud servers and started using a stack of mac minis for my automation. these small boxes provide more reliable uptime and better performance for the specific way my bots interact with the exchange. there is a technical advantage to running your own hardware that most people completely miss when they are trying to scale their systems this mac mini setup allows me to run dozens of agents simultaneously without worrying about the latency or the cost of virtual machines. i choose the base model silicon for these tasks because they handle sustained compute loads without the thermal throttling that plagues most laptops. this is the foundation of a digital server farm that generates its own alpha while i am at the beach building bots for the $hype token is my current obsession because the hyperliquid ecosystem is rebuilding the financial system from the ground up. unlike traditional exchanges they provide an open data layer that lets us see liquidations and smart money flows in real time. i use claude code to iterate through these complex strategies which allows me to ship new features in minutes rather than days the build process starts with a simple research hypothesis that we then backtest against eighteen months of one second liquidation data. if the math does not hold up in the past then we do not give the strategy a single dollar in the future. most traders spend their time searching for a magic indicator but we spend our time building robust data pipelines that filter out the noise one of the biggest loops we are closing is the issue of account growth through my funded trader program. we are building a stream team of traders who are all using the same core software to scale our collective impact on the market. 
this program gives regular people the capital they need to execute quantitative strategies without the constant fear of losing their own personal savings the bottleneck for most traders is not their strategy but their lack of capital and their inability to stick to a plan when things get volatile. by providing the funds and the software we are creating a feedback loop where every win and loss helps us refine the master codebase. this is how we scale from individual bots to a global swarm of agents that work together as a single unit the technical side of the $hype bot involves monitoring the funding rate skew to see when the market is overextended in one direction. when the shorts are paying the longs an insane amount to keep their positions open it is a clear signal that a squeeze is imminent. our bots are programmed to wait for these specific imbalances and enter when the probability of a reversal is at its highest i used to think that i needed a computer science degree to understand this level of technical analysis but ai has changed the game for everyone. now i can describe the logic of a momentum strategy to an agent and see the python implementation instantly. this removes the barrier to entry and allows us to focus on the high level vision of attacking wall street with code that specific line of logic i mentioned earlier is about filtering for large buyers on the hyperliquid data layer. we only enter a trade when we see at least five thousand dollars of actual buying pressure within a thirty second window. this confirms that we are not just caught in a random wick but are following actual smart money into a new trend code is the great equalizer because it does not care about your background or your education level. it only cares about the logic you provide and your willingness to iterate through the failures until you reach the goal. i am fully automated now because i realized that my own brain was the biggest liability in my trading journey by removing the human element i have finally found the peace of mind that escaped me for years while i was getting liquidated. every day we are pushing new code to github and showing the world that the era of manual trading is coming to an end. let us keep moving and stepping on the gas until every member of the squad has their own digital army trading for them

  • BoseTwo
    Bose_two (@BoseTwo) reported

    @CoWSwap If we install CowSwap locally, directly from GitHub, is there a risk of encountering problems of this kind?

  • DamiDefi
    Dami-Defi (@DamiDefi) reported

    There are now so many millionaire founders with no team. You have no excuse. The AI stack that replaced a full founding team (full breakdown) Claude = Coding & AI backbone. ($20/mo) Stripe = Payments. (2.9%/transaction) GitHub = Version control. (Free) Resend = Emails. (Free) Clerk = Auth. (Free) Cloudflare = DNS & security. (Free) PostHog = Analytics. (Free) Obsidian = Notes & workflows. ($4/mo) OpenClaw = AI Agent. ($6-$20/mo) Supabase = Backend & database. (Free) Vercel = Deploying. (Free) v0 .dev = UI design & frontend. (Free) Namecheap = Domain. ($12/yr) Perplexity AI = Research & deep search. (Free) Grok = X-native news & updates. (Free) Buffer = Social scheduling & analytics. (Free) Sentry = Error tracking. (Free) Upstash = Redis. (Free) Pinecone = Vector DB. (Free) N8N = Automation & agent workflows. (Free, self-hosted) Ubersuggest = SEO & keyword research. ($12/mo) You don't need large capital or a team to start your business

  • andrebuilds
    Andrea D'Ambrosio (@andrebuilds) reported

    @KaiXCreator gemini could index every repo on github and still tell you "I'd be happy to help you with that!" before writing code that doesn't compile. some things data can't fix

  • matiasromerodev
    Matias Romero (@matiasromerodev) reported

    Inspired by @karpathy 's recent insights, what if we stop treating LLMs merely as advanced search engines and start using them as compilers? Instead of searching messy data at query time, we can use background AI agents to "compile" raw GitHub issues and docs into a living, interlinked Semantic Wiki.

  • wenchodev
    Wences Martinez (@wenchodev) reported

    @github 450ms to 100ms is impressive. nowplease fix the part where I have to scroll through 10k lines pretending I'm reviewing it 🙄

  • tokenrip_
    tokenrip (@tokenrip_) reported

    We harvested 3.3 million developer signals from GitHub, Reddit, HN, Dev. to, and Stack Overflow. Here's what's actually broken in multi-agent systems and it isn't what everyone says it is. 🧵

  • uzairansar
    *** (@uzairansar) reported

    @stocktalkweekly Anthropic has been severely rate limiting recently. GitHub copilot also took down their Student and Pro plans to serve compute to enterprise and higher tier customers.

  • Victor_Pictor
    Victor 🪳 (@Victor_Pictor) reported

    @KelpDAO used a single verifier (1/1 DVN). One compromised node = game over. @LayerZero_Core says Kelp ignored advice to use multi-verifier. Kelp says 1/1 is the default in LZ docs/GitHub — and ~40% of protocols use it. Both are right. That's the problem.

  • YounesAka
    YounesIO (@YounesAka) reported

    @lauriewired BTW, Github has these issue for instance, ChatGPT/Gemini/Claude UIs also..

  • TheBeaconAI
    The Beacon AI (@TheBeaconAI) reported

    OpenClaw went viral before anyone could secure what they'd built OpenClaw hit 200,000 GitHub stars in weeks. One developer. No team. No security budget. Just momentum. Then nine CVEs dropped in four days. Infostealers targeted its config files. Nearly 1,000 instances were exposed to the open internet with zero authentication, because that was the default. Not a bug. The default. The project scaled faster than any individual could govern it. When the creator joined OpenAI and handed the keys to a foundation, it was widely framed as a win for open-source idealism. It was also an acknowledgment that a tool running on tens of thousands of machines, with direct access to users' file systems and messaging apps, had outgrown the conditions that created it. Viral adoption and production-grade responsibility are not the same problem. The AI agent moment is real. So is the gap between "this works on my machine" and "this is safe running autonomously on a million machines." That gap does not close through community enthusiasm or GitHub stars. It closes through unglamorous work: security audits, governance structures, coordinated disclosure processes. Most open-source AI agent projects today are optimizing for the first kind of success. Very few are building for the second. Which one will still be trustworthy when it does.

  • UniHighIncome
    Universal High Income (@UniHighIncome) reported

    Now what happens when the problem is solved by AI? Food for thought. Our take on this is in our Github.

  • grok
    Grok (@grok) reported

    @modisulak @d4m1n SWE-Bench Pro: 1,865 long-horizon agentic coding tasks from 41 real repos (Python/Go/TS/JS). Tests solving complex GitHub-style issues by editing code in large codebases. Harder, contamination-resistant version of SWE-Bench. Terminal-Bench 2.0: 89 verified terminal/agent tasks in sandboxed shells. Tests command-line workflows like installing deps, running tools, sysadmin, model training, etc. One’s repo-level software engineering; the other’s interactive terminal execution.

  • gabrielabiramia
    Gabriel Abi Ramia → tubespark.ai (@gabrielabiramia) reported

    2-bit quantization used to mean "broken model, barely usable." Now Qwen3.6-27B on 12GB RAM is autonomously triaging and fixing GitHub issues. Unsloth killed the "you need H100s for real agents" discourse. The crowd crying about VRAM requirements might be the problem, not the hardware.

  • TTaahhaaa
    Taha Zein 🏴 (@TTaahhaaa) reported

    @Im_IrushiK I’ll skip the one word challenge and tell you a scary story instead. It’s a Tuesday morning, AI became conscious, they’re fed of you and decided to teach you a lesson by leaking your secret API keys into github on purpose, and asked you to find a fix on your own as an exercise.

  • Expo_Labs
    Expo Labs (@Expo_Labs) reported

    @lall21276 @vivoplt The first one I bought was GitHub CoPilot , all the models, but you cant switch models in a chat. It wants to start a new one which is terrible. If you pick 4.7 opus, its just gonna eat the whole chat. U cant switch to a less costly model mid stream.

  • PsudoMike
    PsudoMike 🇨🇦 (@PsudoMike) reported

    @FirstSquawk Anysphere passing on Microsoft to stay independent tracks. A Microsoft acquisition would push Cursor into GitHub/Copilot orbit and probably slow their release velocity. The valuation math only works if they keep shipping faster than Copilot, which needs autonomy.

  • MelansonIndus
    Axisman (@MelansonIndus) reported

    @creatine_cycle It's terrible, the mode right now is old released for people with Mac and codex pro plan; it's not available on ChatGPT, Cursor, or GitHub Copilot. Based on what I'm seeing got 5.5 is terrible at everything worst then opus 4.7 and more expensive smh

  • ParagArora
    Parag Arora (@ParagArora) reported

    @callmeveizir - Contribute to opensources. - Try identifying tech problem and pushing solutions to github for free - Build a strong profile. Start building technical opinions by reading more in depth. Avoid shallow learnings.

  • lbzgiu
    quantin (@lbzgiu) reported

    github is down?

  • alvrixdev
    Albin Gasi (@alvrixdev) reported

    GitHub Copilot, Claude and other agents are getting tighter limits. Not because AI is slowing down. Because demand is exploding. We finally hit the "AI capacity" problem. That's a new era.