[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"article-pion-spectrum-preserving-optimizer-llms-zh":3,"tags-pion-spectrum-preserving-optimizer-llms-zh":38,"related-lang-pion-spectrum-preserving-optimizer-llms-zh":47,"related-posts-pion-spectrum-preserving-optimizer-llms-zh":51,"series-research-7a3313f6-54dd-4313-bff3-ea9ba4eb31d4":88},{"id":4,"title":5,"content":6,"summary":7,"source":8,"source_url":9,"author":10,"image_url":11,"keywords":12,"language":19,"translated_content":10,"views":20,"is_premium":21,"created_at":22,"updated_at":22,"cover_image":11,"published_at":23,"rewrite_status":24,"rewrite_error":10,"rewritten_from_id":25,"slug":26,"category":27,"related_article_id":28,"status":29,"google_indexed_at":30,"x_posted_at":10,"tweet_text":10,"title_rewritten_at":10,"title_original":10,"key_takeaways":31,"topic_cluster_id":35,"embedding":36,"is_canonical_seed":37},"7a3313f6-54dd-4313-bff3-ea9ba4eb31d4","Pion 用正交變換鎖住權重譜","\u003Cp data-speakable=\"summary\">Pion 用左右正交變換更新 \u003Ca href=\"\u002Ftag\u002Fllm\">LLM\u003C\u002Fa> 權重，讓奇異值保持不變。\u003C\u002Fp>\u003Cp>大型語言\u003Ca href=\"\u002Fnews\u002Falphagrpo-self-reflective-multimodal-generation-zh\">模型\u003C\u002Fa>訓練，大家最熟的是 Adam 這類加法式優化器。做法很直觀：把更新量直接加到權重上。但這篇論文想走另一條路。它認為，對某些矩陣來說，訓練不一定非得靠「加」；也可以在不改變核心譜性質的前提下，去改變權重本身。\u003C\u002Fp>\u003Cp>這篇論文是 \u003Ca href=\"https:\u002F\u002Farxiv.org\u002Fabs\u002F2605.12492\">Pion: A Spectrum-Preserving Optimizer via Orthogonal Equivalence Transformation\u003C\u002Fa>。它的重點很清楚：不是把梯度直接疊到參數上，而是用正交等價變換去更新每個權重矩陣。結果是，模型在訓練過程中仍然會變，但奇異值會被保留下來。\u003C\u002Fp>\u003Ch2>它想解的痛點是什麼\u003C\u002Fh2>\u003Cp>傳統優化器的核心思路，是讓參數往損失函數變小的方向走。這套方法很成熟，也很有效，但它有個副作用：權重矩陣的譜結構可能跟著漂移。對一般工程實作來說，這不一定是問題；但如果你在意矩陣的幾何性質，這種漂移就不是小事。\u003C\u002Fp>\n\u003Cfigure class=\"my-6\">\u003Cimg src=\"https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778653862621-6bth.png\" alt=\"Pion 用正交變換鎖住權重譜\" class=\"rounded-xl w-full\" loading=\"lazy\" \u002F>\u003C\u002Ffigure>\n\u003Cp>作者想處理的，就是這個「訓練要進步，但又不想破壞矩陣結構」的矛盾。Pion 的設計目標，是讓權重矩陣在更新時保持奇異值不變，也就是保留譜資訊，同時仍然能讓模型參數產生實際變化。\u003C\u002Fp>\u003Cp>這個方向對研究者來說很有意思，因為它把優化問題從單純的數值調整，拉到矩陣幾何的層次。對開發者來說，這代表優化器不只是收斂快慢的差別，也可能是在改變模型內部結構時，選擇要保留哪些性質。\u003C\u002Fp>\u003Ch2>Pion 到底怎麼做\u003C\u002Fh2>\u003Cp>Pion 的核心關鍵字是 orthogonal equi\u003Ca href=\"\u002Fnews\u002Flongmemeval-v2-agent-memory-web-workflows-zh\">val\u003C\u002Fa>ence transformation，也就是正交等價變換。白話講，它不是在權重矩陣上做加法，而是把矩陣放在左右兩側，分別乘上正交矩陣。這類變換有個重要特性：會保留長度與角度，因此在這種設計下，也能保留奇異值。\u003C\u002Fp>\u003Cp>所以，Pion 的更新方式跟 Adam 不一樣，也跟論文摘要裡提到的 Muon 這類加法式優化器不同。它不是把一個更新量直接塞進參數，而是透過結構化的變換去改變矩陣。作者明講，這種做法是在調節權重矩陣的幾何結構，同時維持其 spectral norm 不變。\u003C\u002Fp>\u003Cp>從工程角度看，這代表優化器的「更新原語」被換掉了。不是 gradient add，而是 matrix transform。這種設計通常會牽涉更多數學約束，也意味著訓練流程不再只是把學習率調好就結束。論文還提到，他們有系統地檢視設計選項，並分析收斂行為與一些關鍵性質。\u003C\u002Fp>\u003Cp>不過，根據目前提供的 raw 資料，摘要沒有把所有實作細節講完整。也就是說，我們知道它是怎麼一類的方法，但不能從摘要直接推出每個訓練迴圈元件怎麼落地。這點很重要，因為它提醒我們：Pion 是一個明確的數學式優化器，不是單純一句「把梯度換個寫法」而已。\u003C\u002Fp>\u003Ch2>論文實際證明了什麼\u003C\u002Fh2>\u003Cp>就現有摘要來看，作者主張 Pion 是一個穩定、而且有競爭力的替代方案，可用在 LLM pretraining 和 finetuning。這是目前能從原始資料確認的主要實證結論。\u003C\u002Fp>\n\u003Cfigure class=\"my-6\">\u003Cimg src=\"https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778653862184-gvyn.png\" alt=\"Pion 用正交變換鎖住權重譜\" class=\"rounded-xl w-full\" loading=\"lazy\" \u002F>\u003C\u002Ffigure>\n\u003Cp>但也要講清楚限制：提供的內容裡沒有 \u003Ca href=\"\u002Ftag\u002Fbenchmark\">benchmark\u003C\u002Fa> 表格、沒有準確率、沒有吞吐量、沒有 scaling 曲線，也沒有任務清單。換句話說，這篇摘要沒有公開完整 benchmark 細節。所以我們不能說它比哪個方法快多少、準多少，或在哪些資料集上領先。\u003C\u002Fp>\u003Cp>能確認的是，作者不只是提出一個概念，而是往下做了幾件事：推導更新規則、檢查設計選擇、分析收斂行為，以及整理關鍵性質。這表示它不是單點技巧，而是有理論骨架的優化器提案。\u003C\u002Fp>\u003Cp>如果你把它放在 LLM 訓練脈絡裡看，Pion 的價值不在於「又多一個 optimizer 名字」，而是它把一個常被忽略的問題擺上檯面：訓練時，參數不一定只能用加法去更新。你也可以要求某些矩陣性質在過程中被保留。\u003C\u002Fp>\u003Cul>\u003Cli>Pion 在訓練中保留奇異值。\u003C\u002Fli>\u003Cli>它透過左右正交變換更新權重矩陣。\u003C\u002Fli>\u003Cli>作者主張它可用於 LLM pretraining 與 finetuning，且表現穩定、具競爭力。\u003C\u002Fli>\u003Cli>摘要沒有提供完整 benchmark 數字與比較細節。\u003C\u002Fli>\u003C\u002Ful>\u003Ch2>對開發者有什麼影響\u003C\u002Fh2>\u003Cp>對實際做模型訓練的人來說，優化器不是背景元件。它會影響收斂、穩定性，也會影響模型最後學到的表示。Pion 的特殊之處，在於它改變了優化的基本操作：不再是把更新量加到權重上，而是維持一個矩陣譜的不變性。\u003C\u002Fp>\u003Cp>這種設計可能對那些在意訓練穩定性、或在意權重幾何結構的人特別有吸引力。它也可能成為研究非加法式優化方法時的一個基準。因為它不是只在局部修修補補，而是直接重新定義了「更新」這件事。\u003C\u002Fp>\u003Cp>不過，從目前資料也能看出它的限制。第一，沒有 benchmark 數字，沒辦法判斷實際優勢幅度。第二，沒有訓練成本、記憶體成本、或導入難度的資訊。第三，也不知道它在現有訓練堆疊裡是不是容易替換 Adam、Muon 這些常見方案。\u003C\u002Fp>\u003Cp>所以比較務實的讀法，不是「明天就把 Adam 換掉」，而是把 Pion 當成一個值得關注的新方向：如果你想在訓練 LLM 時保留某些矩陣性質，這種正交變換式優化器提供了一個不同的答案。它未必是萬用解，但它確實把優化器的設計空間往前推了一步。\u003C\u002Fp>\u003Ch2>這篇論文的重點整理\u003C\u002Fh2>\u003Cp>Pion 的核心貢獻，可以濃縮成一句話：它用正交等價變換來更新 LLM 權重，並把奇異值保留下來。這讓它和主流加法式優化器走出不同路線，也讓「訓練時要保留\u003Ca href=\"\u002Fnews\u002Fwhy-anthropic-200b-google-cloud-pledge-changes-ai-race-zh\">什麼\u003C\u002Fa>結構」變成一個更具體的問題。\u003C\u002Fp>\u003Cp>從現有摘要來看，這篇論文同時有方法、分析、和初步實證三個面向。它提出更新規則，也討論收斂與性質，並聲稱在 pretraining 與 finetuning 上有穩定且具競爭力的表現。只是，因為摘要沒有公開完整 benchmark 細節，現在還不能把它解讀成壓倒性的勝利。\u003C\u002Fp>\u003Cp>對\u003Ca href=\"\u002Ftag\u002F台灣開發者\">台灣開發者\u003C\u002Fa>來說，這類研究最值得注意的地方，不是某個單一數字，而是它提醒我們：優化器的設計還有很多空間。當大家都在調學習率、batch size、warmup 的時候，有人已經在改寫「權重更新」本身的規則。\u003C\u002Fp>\u003Cp>如果你在追 LLM 訓練方法、矩陣幾何、或非標準優化器，Pion 是一篇值得放進閱讀清單的論文。它不是在做華麗包裝，而是在嘗試把模型訓練的底層操作，改成一種保留譜結構的新方式。\u003C\u002Fp>","Pion 是一種新的 LLM 優化器，改用左右正交變換更新權重，盡量保留奇異值不變。這篇論文主打的是訓練時維持矩陣譜結構，而不是只追求一般的加法式梯度更新。","arxiv.org","https:\u002F\u002Farxiv.org\u002Fabs\u002F2605.12492",null,"https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778653862621-6bth.png",[13,14,15,16,17,18],"LLM optimizer","orthogonal transformation","singular values","spectrum-preserving","pretraining","finetuning","zh",1,false,"2026-05-13T06:30:29.035095+00:00","2026-05-13T06:30:28.958+00:00","done","99a3144d-4f7d-400a-a25e-75a31685ffad","pion-spectrum-preserving-optimizer-llms-zh","research","b563114c-8592-4aff-88b2-54ef64cc51fc","published","2026-05-13T09:00:10.169+00:00",[32,33,34],"Pion 用左右正交變換更新權重，而不是直接做加法更新。","它的設計目標是保留權重矩陣的奇異值，也就是維持譜結構。","摘要聲稱它可用於 LLM pretraining 與 finetuning，且表現穩定、具競爭力，但沒有公開完整 benchmark 細節。","a3d63458-3418-4405-881d-35f17d0280f4","[-0.011562072,0.0053212233,0.03132356,-0.09281722,-0.015965158,0.00750495,-0.022473428,0.008597306,-0.00089532445,-0.0053574154,-0.0065800576,-0.037199818,0.011461242,-0.015409232,0.13439347,0.023734037,-0.01879588,0.00013830217,0.028808517,-0.014744019,-0.007164852,-0.002437585,-0.012111187,0.025646431,0.012917176,0.011812334,-0.0012335124,-0.00040325444,0.05692052,-0.003317185,0.012258053,0.040604457,0.027031232,0.03098758,-0.0015753179,0.015802791,0.022387264,0.0019777052,-0.010914254,0.019724289,0.00213729,0.0016243369,0.021368122,-0.002072968,-0.022246487,-0.0046008155,-0.008870054,-0.04430434,-0.019500421,0.017175889,-0.020470554,0.023053711,-0.013939076,-0.14411677,0.016359052,0.011456517,-0.0087868655,0.004110195,0.01281141,0.0067609143,-0.02177288,0.014336963,0.02178855,0.006077559,-0.01061064,-0.04713074,0.037753522,0.0141157955,-0.011470006,-0.016050143,0.019870041,-0.0010593499,0.016841864,-0.004575477,0.023510788,-0.055303976,0.026571453,0.0033437698,0.023637265,0.010889459,-0.0072184848,-0.03092486,-0.00039029054,0.018526476,-0.011837793,0.00010189616,0.03243348,0.003710728,0.008874386,-0.004063666,-0.008793851,0.0027766791,0.020569654,-0.008111622,0.010445299,0.013914416,-0.009637138,-0.0030217916,-0.002546303,-0.013898081,-0.0065896083,-0.012661433,0.002166289,0.016379757,0.001658573,0.012737214,0.02025904,0.00013038878,0.00944612,0.0020451471,-0.0155277755,-0.006507747,0.0055183913,-0.019164355,0.031560212,-0.14031951,0.013359178,0.0010321633,-0.01797567,-0.008914739,0.01589854,0.009183974,0.034524497,0.038429502,0.013001515,0.007276997,-0.010054651,-0.005247213,-0.0076000607,-0.008426927,-0.01718649,0.0010337506,-0.019935066,-0.0050856513,0.025440857,0.030742584,0.00073281827,-0.010741781,-0.014452574,-0.0095071085,-0.026610702,0.01297691,-0.012348376,0.026174543,-0.037188727,-0.023726273,-0.04607522,-0.012449801,0.016569993,-0.0053217486,0.009911985,-0.018224541,-0.014715779,-0.020282263,0.026101578,-0.026757391,0.01581628,0.018010397,0.02468681,0.0036084177,0.019141432,-0.0153847085,-0.012784006,-0.0055768834,0.0148315085,0.0011353843,-0.026989518,-0.0076591405,0.021007083,0.020870864,0.019934766,0.004248897,-0.015353682,0.0038995757,0.027641991,-0.019525923,-0.035477426,0.010191309,0.0076530064,-0.0024499537,0.018950323,0.03428447,0.0065962556,0.006432233,0.003077901,0.00014855283,0.0055514174,0.019643735,-0.002806468,0.035207465,-0.0105252145,-0.006006454,-0.0175775,-0.0026588247,-0.013360926,-0.01528277,-0.0024939876,-0.00090829964,-0.009144332,-0.004868413,-0.014598166,-0.024336502,0.019466769,-0.009741324,0.011373999,-0.0067379256,0.00647733,-0.032025877,0.015990179,0.017651316,-0.0047397977,0.011811082,0.011243147,-0.01822607,0.009205294,-0.0047983523,-0.0050717513,0.015642362,-0.008857441,-0.014195784,0.014988802,-0.015529557,-0.015325113,-0.023764817,-0.01166931,-0.0067627234,-0.011085083,0.0025021308,-0.005249683,-0.0057326015,-0.003360714,0.012150009,0.02948818,0.00038243772,0.014056213,0.019478457,-0.015436585,0.0018639959,0.0027175138,0.01689909,-0.00873719,0.0025671637,-0.012573984,-0.0071987123,0.025922993,-0.01778039,0.011974165,0.018284922,0.016797002,0.034656227,0.014806716,-0.008974426,0.007484413,0.0015184421,-0.00937243,0.00022984963,-0.0071374374,-0.023250064,0.0013280099,0.022613907,-0.00019499709,0.00062938605,0.005706354,0.011278399,0.002052752,-0.020412872,0.005170272,-0.0008020144,-0.010482989,0.02789856,-0.011799013,0.01244115,0.02430373,-0.0021982652,0.011288959,-0.033324216,-0.04679777,0.034644887,0.004313198,-0.014466571,0.01935515,0.008999926,0.034496713,0.013003487,-0.014932594,0.019422904,-0.029092254,-0.007448146,-0.011969927,-0.0072925203,-0.005292218,0.025712062,-0.020828974,-0.0058395914,-0.0039157057,-0.042052213,-0.00021821217,-0.0034771655,0.0017241595,0.01156191,-0.0023093577,-0.012748868,0.027539006,0.04607159,0.0022269099,0.028119937,-0.011784838,0.02227112,-0.008094489,0.00561769,0.005477089,0.0030114998,-0.004755434,-0.005686144,-0.047321755,-0.010043626,0.03054718,-0.009010333,0.013285567,0.010195707,0.0146862315,-0.044763982,-0.023963228,-0.0035735648,-0.008710457,-0.019698586,-0.017008873,-0.020713411,0.009222543,0.0002490984,-0.004736068,0.027448125,0.01853591,-0.020814281,-0.0010654345,-0.017158285,-0.011578082,-0.00026870798,-0.030793473,-0.010049868,-0.009464491,-0.01416175,0.008177711,-0.0017404859,-0.023444574,0.012482432,0.009836705,0.01306405,0.014454961,-0.021936873,0.008559748,-0.024326928,0.021010445,-0.009980982,-0.016005656,0.03720111,-0.025397148,-0.007637742,0.021662913,-0.028758317,0.023863073,-0.027612492,0.004924164,0.0019846077,0.039151996,-0.04202258,0.0074104024,0.01836334,-0.010820509,-0.005694356,0.017445084,-0.004159696,-0.03305672,-0.007641327,-0.0050270515,0.010376301,-0.017108107,-0.014207574,-0.019760618,-0.012298406,-0.004297987,0.038147416,-0.0039578145,-0.009201445,0.010261813,-0.006898511,0.0065557864,0.011628439,0.0040083895,0.0022513939,-0.01753387,0.0153273735,0.0049868505,-0.005425175,0.0012832329,0.024204183,-0.013710012,0.01659778,-0.011734773,-0.00076046784,0.017093988,0.008539318,0.012393076,-0.030156398,0.00826546,0.00394547,0.008952068,-0.00076516235,-0.009542964,-0.044670396,0.015357165,0.026420329,-0.0021435649,-0.0076214727,-0.0021675292,0.013768704,-0.015011716,-0.024980692,0.03377496,-0.009693807,-0.003198972,0.0015225542,-0.006370631,-0.004898324,0.010376704,-0.02444658,-0.015086349,-0.022834476,0.0212188,-0.020618176,0.008259599,-0.024498696,-0.0053343694,-0.013607784,-0.009470336,0.00084965315,0.0025983904,0.024692582,-0.013399026,-0.042464525,-0.016764343,-0.0016745554,0.0025595394,0.010607934,0.030684156,-0.017936077,0.01845962,-0.013337051,-0.035595637,0.028657872,0.03731346,0.010752239,0.01736175,-0.009966644,-0.004936834,-0.0075598615,0.0052575613,0.012803115,0.00070269936,-0.017425973,-0.016099868,-0.035657126,0.026258186,0.017197616,-0.016030993,-0.044134196,-0.020718561,0.009854136,-0.0067935428,0.023670783,-0.0072625945,-0.004482409,0.0106363045,0.009793623,0.0046634874,-0.0049561406,-0.017240249,-0.01290721,0.014181598,0.018240156,0.012348931,-0.037616443,0.014320777,-0.008747912,0.024472557,0.0015982286,-0.017190622,0.027825948,0.0028262974,-0.002207884,0.024901807,0.0064606005,0.025813688,-0.008064031,-0.0053376057,-0.009256032,-0.0111992825,-0.03192623,0.0022302454,0.01506896,0.010244987,-0.026511306,0.023997508,-0.0064361948,-0.007867461,-0.015297016,0.0073637473,-0.014662632,0.02591996,-0.013377545,-0.016747665,0.015442211,-0.012266727,0.052657668,-0.0054837004,-0.010039588,0.01132458,0.027710661,-0.014463022,-0.00837758,-0.009419467,-0.011654855,-0.0039938106,0.023928495,-0.01085241,-0.019108351,0.00857765,0.0044648047,0.015126419,-0.009102414,-0.025679331,0.031989932,0.013022741,-0.024213048,-0.011944499,-0.020697199,0.013144546,0.015412397,-0.0013252875,-0.0053274,-0.0006157379,0.015591805,0.013203405,0.0018200916,0.03169408,-0.08450442,0.017486516,0.013508856,0.0110458145,0.010812947,-0.021665405,0.00087026093,-0.004769063,0.008831278,-0.0055023693,0.0055178204,0.0033136583,0.013772514,0.014225855,0.0037143375,-0.034254596,-0.007813378,-0.025885612,0.0050840853,0.00064192765,0.02584899,0.017091898,-0.011270865,0.022724165,-0.027656265,-0.01736407,0.02397564,0.011324273,0.017016463,0.01459271,-0.01736725,-0.017972741,0.0058021285,0.0011774321,0.0074077006,0.03383839,0.014407208,0.0061298083,0.024931015,-0.0035142028,0.024740385,0.016900688,-0.005976624,-0.0050156326,0.008523009,-0.0062330314,-0.011240323,0.016848773,0.0042477343,-0.009206194,-0.011199561,-0.0066463556,-0.024590159,0.0007557372,-0.01548159,-0.020963639,-0.018547876,-0.0032605084,-0.009417233,0.026066108,0.004525302,0.004088864,-0.026366537,0.032009106,-0.010853651,0.008715733,-0.010439926,0.045591913,-0.0026040706,-0.00185631,0.0008208735,0.0039618807,0.02253516,-0.020783788,-0.034746747,0.004005153,-0.006642904,0.008040705,-0.0049221073,-0.0027382663,-0.0042995196,-6.337517e-05,-0.080799855,-0.008218287,-0.0028983506,-0.005839401,0.029104153,0.0030964427,0.0020218173,-0.024976986,-0.005384073,-0.047136504,0.002974577,-0.009540261,-0.005864707,-0.0563341,-0.018711526,0.018952733,-0.014325167,-0.019103587,-0.03147544,0.0023617302,0.015124561,-0.008053294,0.029512748,0.002114554,-0.011452505,0.016400758,-0.007027557,-0.021923602,-0.0131150745,0.0013341733,0.012938476,-0.1225361,0.0056385724,0.0022022743,-0.02690458,0.02515807,-0.033642154,-0.01859279,-0.011314257,0.006998544,-0.0059751,-0.014743861,-0.036936514,-0.02969505,-0.006023384,-0.006591032,0.12609561,0.0020218056,0.007339878,-0.013404533,-0.041308977,-0.006612475,-0.01536856,-0.0067117563,0.016320106,0.013503328,-0.0129272835,0.008311853,-0.039169636,-0.018415475,0.025826884,0.004022798,0.015729448,0.0045251525,-0.02263904,-0.007669943,-0.006227731,0.0034276806,-0.024944102,0.015700785,0.019149287,0.012992917,0.018096615,0.011460588,-0.030620981,-0.027907902,-0.03160566,-0.013343747,-0.0067524156,0.0040735835,-0.0027592438,-0.01855755,-0.05189117,0.01396983,-0.018555745,0.012483277,-0.03606068,-0.012960733,0.019458821,-0.0033585837,0.010847673,0.018217176,0.0120390635,-0.0047097374,0.033687297,-0.008515498,-0.016166698,0.023254765,0.012430098,-0.0056262254,0.018359398,0.0043869447,-0.011305048,-0.0081274165,-0.03186608,-0.01262748,-0.010750421,0.010150245,-0.0013899076,0.036858443,0.010153412,-0.009848423,0.020760959,-0.018568402,0.009688374,-0.002633964,0.0013386528,0.0015931777,-0.008500524,-0.007778344,-0.010061549,-0.006008159,-0.007256374,-0.011563044,-0.006128521,-0.020179953,0.007018633,-0.009010517,-0.010072126,0.0040584537,-0.0050003524,0.005291403,-0.029149491,-0.022236908,-0.011465691,-0.027851544,0.011595963,-0.0036652482,0.009504293,0.020030182,-0.005727251]",true,[39,40,42,44,45],{"name":17,"slug":17},{"name":13,"slug":41},"llm-optimizer",{"name":15,"slug":43},"singular-values",{"name":16,"slug":16},{"name":14,"slug":46},"orthogonal-transformation",{"id":28,"slug":48,"title":49,"language":50},"pion-spectrum-preserving-optimizer-llms-en","Pion keeps LLM weights’ spectrum fixed","en",[52,58,64,70,76,82],{"id":53,"slug":54,"title":55,"cover_image":56,"image_url":56,"created_at":57,"category":27},"667b72b6-e821-4d68-80a1-e03340bc85f1","turboquant-seo-shift-small-sites-zh","TurboQuant 與小站 SEO 變化","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778840440690-kcw9.png","2026-05-15T10:20:27.319472+00:00",{"id":59,"slug":60,"title":61,"cover_image":62,"image_url":62,"created_at":63,"category":27},"381fb6c6-6da7-4444-831f-8c5eed8d685c","turboquant-vllm-comparison-fp8-kv-cache-zh","TurboQuant 與 FP8 實測結果","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778839867551-4v9g.png","2026-05-15T10:10:36.034569+00:00",{"id":65,"slug":66,"title":67,"cover_image":68,"image_url":68,"created_at":69,"category":27},"c15f45ee-a548-4dbf-8152-91de159c1a11","llmbda-calculus-agent-safety-rules-zh","LLMbda 演算替 AI 代理人立安全規則","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778825503412-mlbf.png","2026-05-15T06:10:34.832664+00:00",{"id":71,"slug":72,"title":73,"cover_image":74,"image_url":74,"created_at":75,"category":27},"0c02225c-d6ff-44f8-bc92-884c8921c4a3","low-complexity-beamspace-denoiser-mmwave-mimo-zh","更簡單的毫米波波束域去噪器","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778814650361-xtc2.png","2026-05-15T03:10:30.06639+00:00",{"id":77,"slug":78,"title":79,"cover_image":80,"image_url":80,"created_at":81,"category":27},"9d27f967-62cc-433f-8cdb-9300937ade13","ai-benchmark-wins-cyber-scare-defenders-zh","為什麼 AI 基準賽在資安領域的勝利，應該讓防守方警醒","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778807450006-nofx.png","2026-05-15T01:10:29.379041+00:00",{"id":83,"slug":84,"title":85,"cover_image":86,"image_url":86,"created_at":87,"category":27},"bc402dc6-5da6-46fc-9d66-d09cb215f72b","why-linux-security-needs-patch-wave-mindset-zh","為什麼 Linux 安全需要「補丁浪潮」思維","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778741449813-s2wn.png","2026-05-14T06:50:24.052583+00:00",[89,94,99,104,109,114,119,124,129,134],{"id":90,"slug":91,"title":92,"created_at":93},"f18dbadb-8c59-4723-84a4-6ad22746c77a","deepmind-bets-on-continuous-learning-ai-2026-zh","DeepMind 押注 2026 連續學習 AI","2026-03-26T08:16:02.367355+00:00",{"id":95,"slug":96,"title":97,"created_at":98},"f4a106cb-02a6-4508-8f39-9720a0a93cee","ml-papers-of-the-week-github-research-desk-zh","每週 ML 論文清單，為何紅到 GitHub","2026-03-27T01:11:39.284175+00:00",{"id":100,"slug":101,"title":102,"created_at":103},"c4f807ca-4e5f-47f1-a48c-961cf3fc44dc","ai-ml-conferences-to-watch-in-2026-zh","2026 AI 研討會投稿時程整理","2026-03-27T01:51:53.874432+00:00",{"id":105,"slug":106,"title":107,"created_at":108},"9f50561b-aebd-46ba-94a8-363198aa7091","openclaw-agents-manipulated-self-sabotage-zh","OpenClaw Agent 會自己搞砸自己","2026-03-28T03:03:18.786425+00:00",{"id":110,"slug":111,"title":112,"created_at":113},"11f22e92-7066-4978-a544-31f5f2156ec6","vega-learning-to-drive-with-natural-language-instructions-zh","Vega：使用自然語言指示進行自駕車控制","2026-03-28T14:54:04.847912+00:00",{"id":115,"slug":116,"title":117,"created_at":118},"a4c7cfec-8d0e-4fec-93cf-1b9699a530b8","drive-my-way-en-zh","Drive My Way：個性化自駕車風格的實現","2026-03-28T14:54:26.207495+00:00",{"id":120,"slug":121,"title":122,"created_at":123},"dec02f89-fd39-41ba-8e4d-11ede93a536d","training-knowledge-bases-with-writeback-rag-zh","用 WriteBack-RAG 強化知識庫提升檢索效能","2026-03-28T14:54:45.775606+00:00",{"id":125,"slug":126,"title":127,"created_at":128},"3886be5c-a137-40cc-b9e2-0bf18430c002","packforcing-efficient-long-video-generation-method-zh","PackForcing：短影片訓練也能生成長影片","2026-03-28T14:55:02.688141+00:00",{"id":130,"slug":131,"title":132,"created_at":133},"72b90667-d930-4cc9-8ced-aaa0f8968d44","pixelsmile-toward-fine-grained-facial-expression-editing-zh","PixelSmile：提升精細臉部表情編輯的新方法","2026-03-28T14:55:20.678181+00:00",{"id":135,"slug":136,"title":137,"created_at":138},"cf046742-efb2-4753-aef9-caed5da5e32e","adaptive-block-scaled-data-types-zh","IF4：神經網路量化的聰明選擇","2026-03-31T06:00:36.990273+00:00"]