[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"article-github-copilot-data-ai-training-opt-out-en":3,"tags-github-copilot-data-ai-training-opt-out-en":30,"related-lang-github-copilot-data-ai-training-opt-out-en":42,"related-posts-github-copilot-data-ai-training-opt-out-en":46,"series-industry-3d4e9329-9762-4714-920c-f7b5e69bd66c":83},{"id":4,"title":5,"content":6,"summary":7,"source":8,"source_url":9,"author":10,"image_url":11,"keywords":12,"language":18,"translated_content":10,"views":19,"is_premium":20,"created_at":21,"updated_at":21,"cover_image":11,"published_at":22,"rewrite_status":23,"rewrite_error":10,"rewritten_from_id":24,"slug":25,"category":26,"related_article_id":27,"status":28,"google_indexed_at":29,"x_posted_at":10,"tweet_text":10,"title_rewritten_at":10,"title_original":10,"key_takeaways":10,"topic_cluster_id":10,"embedding":10,"is_canonical_seed":20},"3d4e9329-9762-4714-920c-f7b5e69bd66c","GitHub Will Train Copilot AI on User Data by Default","\u003Cp>\u003Ca href=\"https:\u002F\u002Fgithub.com\u002F\" target=\"_blank\" rel=\"noopener\">GitHub\u003C\u002Fa> is changing how it handles \u003Ca href=\"https:\u002F\u002Fgithub.com\u002Ffeatures\u002Fcopilot\" target=\"_blank\" rel=\"noopener\">Copilot\u003C\u002Fa> data, and the new default is going to bother a lot of developers. Starting April 24, interaction data from Copilot Free, Pro, and Pro+ will feed GitHub’s AI training unless users manually opt out.\u003C\u002Fp>\u003Cp>The part that matters is the scope: GitHub says the data includes prompts, outputs, code snippets, and related context. Business, Enterprise, and educational Copilot users are excluded, and anyone who already opted out of GitHub’s earlier product-improvement data collection keeps that preference.\u003C\u002Fp>\u003Ch2>What GitHub is changing on April 24\u003C\u002Fh2>\u003Cp>GitHub’s update turns a previously narrower data-collection setting into something more direct. 
If you use Copilot Free, Pro, or Pro+, your chats and code interactions can be pulled into model training unless you change the privacy setting first.\u003C\u002Fp>\u003Cp>This matters because Copilot is no longer a side feature. It is one of the most visible AI \u003Ca href=\"\u002Fnews\u002Fai-coding-tool-prices-2026-free-vs-paid-en\">coding tools\u003C\u002Fa> on the market, and GitHub is treating user interaction data as a training asset. For a company owned by \u003Ca href=\"https:\u002F\u002Fwww.microsoft.com\u002F\" target=\"_blank\" rel=\"noopener\">Microsoft\u003C\u002Fa>, that is a logical move. For developers, it is also a reminder that AI products often improve by feeding on the same user behavior people assumed stayed private.\u003C\u002Fp>\u003Cul>\u003Cli>Effective date: April 24, 2026\u003C\u002Fli>\u003Cli>Affected plans: Copilot Free, Pro, and Pro+\u003C\u002Fli>\u003Cli>Unaffected plans: Business, Enterprise, and educational Copilot\u003C\u002Fli>\u003Cli>Data types: prompts, outputs, code snippets, and related context\u003C\u002Fli>\u003Cli>Default behavior: training is enabled unless you manually opt out\u003C\u002Fli>\u003C\u002Ful>\u003Ch2>Why the developer backlash is so loud\u003C\u002Fh2>\u003Cp>The reaction on GitHub has been harsh, and it is easy to see why. Developers are being asked to let the same tool that writes code from their prompts also learn from those prompts. That creates a basic trust problem, especially for open-source contributors who already worry about how code gets reused.\u003C\u002Fp>\u003Cp>GitHub argues that broader participation will improve accuracy, security, and bug detection. That is a sensible product pitch. Still, the company is asking people to provide training material from their daily work, and the default setting matters more than the explanation. 
When a privacy choice is buried in settings, most people never change it.\u003C\u002Fp>\u003Cblockquote>“If you are not paying for it, you’re not the customer; you’re the product being sold.” — Andrew Lewis, 2010\u003C\u002Fblockquote>\u003Cp>That quote gets repeated because it captures the discomfort around data-heavy platforms. GitHub is not selling Copilot users directly to advertisers here, but the logic feels familiar: the service gets better by absorbing user behavior, and the user has to pay attention to avoid becoming part of the training pipeline.\u003C\u002Fp>\u003Ch2>How this compares with other AI coding tools\u003C\u002Fh2>\u003Cp>GitHub is far from alone in using product data to improve AI systems, but the default choice still sets it apart. Some tools ask for opt-in during setup. Others keep business data out of model training by policy. GitHub is taking a more aggressive route for individual plans, and that will shape how developers compare it with competing products.\u003C\u002Fp>\u003Cp>There is also a practical difference between consumer AI and coding AI. A chat prompt about dinner plans is one thing. A prompt that includes a private repo path, a proprietary function, or a security bug is another. 
That makes the quality of GitHub’s privacy controls more important than the usual “improve the product” language suggests.\u003C\u002Fp>\u003Cul>\u003Cli>\u003Ca href=\"https:\u002F\u002Fopenai.com\u002Fchatgpt\u002F\" target=\"_blank\" rel=\"noopener\">ChatGPT\u003C\u002Fa> offers data controls, but enterprise settings are usually separated from consumer defaults\u003C\u002Fli>\u003Cli>\u003Ca href=\"https:\u002F\u002Fdocs.anthropic.com\u002Fen\u002Fdocs\u002Fbuild-with-claude\u002Fclaude-code\" target=\"_blank\" rel=\"noopener\">Claude Code\u003C\u002Fa> focuses on coding workflows, with policy choices that differ by plan and deployment\u003C\u002Fli>\u003Cli>\u003Ca href=\"https:\u002F\u002Fcodeium.com\u002F\" target=\"_blank\" rel=\"noopener\">Codeium\u003C\u002Fa> markets AI coding assistance with separate business and individual usage terms\u003C\u002Fli>\u003Cli>\u003Ca href=\"https:\u002F\u002Faws.amazon.com\u002Fq\u002Fdeveloper\u002F\" target=\"_blank\" rel=\"noopener\">Amazon Q Developer\u003C\u002Fa> also separates business controls from individual usage\u003C\u002Fli>\u003C\u002Ful>\u003Cp>If you are comparing tools for a team, the question is no longer only “Which assistant writes better code?” It is also “Which assistant keeps my prompts out of training by default?” That second question may decide procurement more often than marketing copy does.\u003C\u002Fp>\u003Ch2>What developers should do right now\u003C\u002Fh2>\u003Cp>If you use Copilot and do not want your interactions used for training, check your settings before April 24. GitHub says prior opt-outs remain in place for users who already disabled the broader data collection setting, but everyone else needs to act manually.\u003C\u002Fp>\u003Cp>That is the immediate takeaway, and it is a simple one. Review the privacy controls, decide whether your prompts contain anything sensitive, and assume that anything you leave enabled may be used to improve GitHub’s models. 
If you work on closed-source code, client projects, or security-sensitive systems, this is worth a careful look rather than a quick click-through.\u003C\u002Fp>\u003Cp>For readers tracking broader AI policy, this change also fits a pattern that keeps showing up across the industry: consumer AI products want more user data, and the default settings are where the real policy lives. If GitHub sees minimal churn after April 24, expect more software vendors to copy the same playbook. If the backlash is strong enough, the company may have to make the opt-in language louder or risk turning a useful coding assistant into a trust problem.\u003C\u002Fp>\u003Cp>One good next step is to compare the privacy defaults in your team’s coding tools this week, before the setting becomes a habit no one remembers to revisit.\u003C\u002Fp>","GitHub will use Copilot Free, Pro, and Pro+ interaction data for AI training on April 24 unless users opt out in settings.","hothardware.com","https:\u002F\u002Fhothardware.com\u002Fnews\u002Fgithub-reverses-course-and-will-train-ai-on-your-copilot-data-unless-you-opt-out",null,"https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1775121668270-9jmd.png",[13,14,15,16,17],"GitHub Copilot","AI training","opt out","developer privacy","Microsoft","en",1,false,"2026-04-02T08:06:30.316249+00:00","2026-04-02T08:06:30.288+00:00","done","c67dcaea-75a3-43b0-beac-e9d7bf0507bd","github-copilot-data-ai-training-opt-out-en","industry","3bd1841e-dd4d-4776-8597-8c3f3e240a03","published","2026-04-08T09:00:54.19+00:00",[31,33,35,37,39],{"name":17,"slug":32},"microsoft",{"name":14,"slug":34},"ai-training",{"name":13,"slug":36},"github-copilot",{"name":16,"slug":38},"developer-privacy",{"name":40,"slug":41},"Opt-out","opt-out",{"id":27,"slug":43,"title":44,"language":45},"github-copilot-data-ai-training-opt-out-zh","GitHub 預設拿 Copilot 資料訓練 
AI","zh",[47,53,59,65,71,77],{"id":48,"slug":49,"title":50,"cover_image":51,"image_url":51,"created_at":52,"category":26},"1270e2f4-6f3b-4772-9075-87c54b07a8d1","iren-signs-nvidia-ai-infrastructure-pact-en","IREN signs Nvidia AI infrastructure pact","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778871059665-3vhi.png","2026-05-15T18:50:38.162691+00:00",{"id":54,"slug":55,"title":56,"cover_image":57,"image_url":57,"created_at":58,"category":26},"b308c85e-ee9c-4de6-b702-dfad6d8da36f","circle-agent-stack-ai-payments-en","Circle launches Agent Stack for AI payments","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778870450891-zv1j.png","2026-05-15T18:40:31.462625+00:00",{"id":60,"slug":61,"title":62,"cover_image":63,"image_url":63,"created_at":64,"category":26},"f7028083-46ba-493b-a3db-dd6616a8c21f","why-nebius-ai-pivot-is-more-real-than-hype-en","Why Nebius’s AI Pivot Is More Real Than Hype","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778823055711-tbfv.png","2026-05-15T05:30:26.829489+00:00",{"id":66,"slug":67,"title":68,"cover_image":69,"image_url":69,"created_at":70,"category":26},"b63692ed-db6a-4dbd-b771-e1babdc94af7","nvidia-backs-corning-factories-with-billions-en","Nvidia backs Corning factories with billions","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778822444685-tvx6.png","2026-05-15T05:20:28.914908+00:00",{"id":72,"slug":73,"title":74,"cover_image":75,"image_url":75,"created_at":76,"category":26},"26ab4480-2476-4ec7-b43a-5d46def6487e","why-anthropic-gates-foundation-ai-public-goods-en","Why Anthropic and the Gates Foundation should fund AI public 
goods","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778796645685-wbw0.png","2026-05-14T22:10:22.60302+00:00",{"id":78,"slug":79,"title":80,"cover_image":81,"image_url":81,"created_at":82,"category":26},"49741f0d-bb3d-4f02-b644-2b644880ab00","why-observability-is-critical-cloud-native-systems-en","Why Observability Is Critical for Cloud-Native Systems","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778794247497-viaz.png","2026-05-14T21:30:26.87222+00:00",[84,89,94,99,104,109,114,119,124,129],{"id":85,"slug":86,"title":87,"created_at":88},"d35a1bd9-e709-412e-a2df-392df1dc572a","ai-impact-2026-developments-market-en","AI's Impact in 2026: Key Developments and Market Shifts","2026-03-25T16:20:33.205823+00:00",{"id":90,"slug":91,"title":92,"created_at":93},"5ed27921-5fd6-492e-8c59-78393bf37710","trumps-ai-legislative-framework-en","Trump's AI Legislative Framework: What's Inside?","2026-03-25T16:22:20.005325+00:00",{"id":95,"slug":96,"title":97,"created_at":98},"e454a642-f03c-4794-b185-5f651aebbaca","nvidia-gtc-2026-key-highlights-innovations-en","NVIDIA GTC 2026: Key Highlights and Innovations","2026-03-25T16:22:47.882615+00:00",{"id":100,"slug":101,"title":102,"created_at":103},"0ebb5b16-774a-4922-945d-5f2ce1df5a6d","claude-usage-diversifies-learning-curves-en","Claude Usage Diversifies, Learning Curves Emerge","2026-03-25T16:25:50.770376+00:00",{"id":105,"slug":106,"title":107,"created_at":108},"69934e86-2fc5-4280-8223-7b917a48ace8","openclaw-ai-commoditization-concerns-en","OpenClaw's Rise Raises Concerns of AI Model Commoditization","2026-03-25T16:26:30.582047+00:00",{"id":110,"slug":111,"title":112,"created_at":113},"b4b2575b-2ac8-46b2-b90e-ab1d7c060797","google-gemini-ai-rollout-2026-en","Google's Gemini AI Rollout Extended to 
2026","2026-03-25T16:28:14.808842+00:00",{"id":115,"slug":116,"title":117,"created_at":118},"6e18bc65-42ae-4ad0-b564-67d7f66b979e","meta-llama4-fabricated-results-scandal-en","Meta's Llama 4 Scandal: Fabricated AI Test Results Unveiled","2026-03-25T16:29:15.482836+00:00",{"id":120,"slug":121,"title":122,"created_at":123},"bf888e9d-08be-4f47-996c-7b24b5ab3500","accenture-mistral-ai-deployment-en","Accenture and Mistral AI Team Up for AI Deployment","2026-03-25T16:31:01.894655+00:00",{"id":125,"slug":126,"title":127,"created_at":128},"5382b536-fad2-49c6-ac85-9eb2bae49f35","mistral-ai-high-stakes-2026-en","Mistral AI: Facing High Stakes in 2026","2026-03-25T16:31:39.941974+00:00",{"id":130,"slug":131,"title":132,"created_at":133},"9da3d2d6-b669-4971-ba1d-17fdb3548ed5","cursors-meteoric-rise-pressures-en","Cursor's Meteoric Rise Faces Industry Pressures","2026-03-25T16:32:21.899217+00:00"]