[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"article-why-fine-tuning-llms-domain-tasks-right-default-zh":3,"article-related-why-fine-tuning-llms-domain-tasks-right-default-zh":37,"series-research-50b2e74e-7248-43a3-8775-451bf2569f33":86},{"id":4,"title":5,"content":6,"summary":7,"source":8,"source_url":9,"author":10,"image_url":11,"keywords":12,"language":19,"translated_content":10,"views":20,"is_premium":21,"created_at":22,"updated_at":22,"cover_image":11,"published_at":23,"rewrite_status":24,"rewrite_error":10,"rewritten_from_id":25,"slug":26,"category":27,"related_article_id":28,"status":29,"google_indexed_at":30,"x_posted_at":10,"tweet_text":10,"title_rewritten_at":10,"title_original":10,"key_takeaways":31,"topic_cluster_id":35,"embedding":36,"is_canonical_seed":21},"50b2e74e-7248-43a3-8775-451bf2569f33","Why fine-tuning LLMs for domain tasks is the right default","\u003Cp data-speakable=\"summary\">When an \u003Ca href=\"\u002Ftag\u002Fllm\">LLM\u003C\u002Fa> must hold high accuracy and fixed output in a specific domain, fine-tuning is a better default than prompt engineering alone.\u003C\u002Fp>\u003Cp>My claim is that for rule-bound tasks in medicine, law, finance, and customer support, fine-tuning an LLM should be the default plan, not an option patched on at the end. The reason is direct: general-purpose models excel at broad conversation, but they do not guarantee stable hits on format, terminology, and judgment criteria in a specialist context.\u003C\u002Fp>\u003Cp>You can treat this as product engineering rather than model worship. When a task's inputs and outputs are relatively fixed, the real value is not a model that knows a little about \u003Ca href=\"\u002Fnews\u002Fwhy-global-ai-regulation-2026-rewards-modular-compliance-zh\">everything\u003C\u002Fa>, but one that stays consistent, verifiable, and regression-testable in the 95%-plus of situations that repeat.\u003C\u002Fp>\u003Ch2>Argument one: domain data beats general breadth\u003C\u002Fh2>\u003Cp>A general LLM's strength is language ability, but in specialist scenarios fluent language does not mean correct answers. In tasks like insurance claim classification, support ticket routing, and contract clause tagging, what truly matters is label consistency and terminology alignment. You only need to watch \u003Ca href=\"\u002Fnews\u002Fatlas-one-token-visual-reasoning-zh\">one\u003C\u002Fa> model misjudge a 'cancel renewal' as a 'refund request' to know this is not a minor error; it feeds straight into process cost.\u003C\u002Fp>\n\u003Cfigure class=\"my-6\">\u003Cimg src=\"https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778916229431-9olk.png\" alt=\"Why fine-tuning LLMs for domain tasks is the right default\" class=\"rounded-xl w-full\" loading=\"lazy\" \u002F>\u003C\u002Ffigure>\n\u003Cp>The value of fine-tuning is that it writes the patterns of correct cases into the model weights instead of leaning on improvised prompts every time. This is why, on text classification, sentiment judgment, and information extraction, fine-tuned models are often steadier than prompting alone. A dataset of 5,000 to 50,000 high-quality annotations is usually enough for the model to learn the domain's boundaries, and the result tends to be more reliable than writing an ever-longer prompt.\u003C\u002Fp>\u003Cp>More importantly, the cost of failure in domain tasks is usually asymmetric. A medical summary that drops a negation, a legal review that misses an exception clause, a financial classifier that files a high-risk customer in the wrong bucket: none of these can be repaired by 'asking again.' Fine-tuning is not about showing off; it pulls the model toward controllable, reproducible behavior, which is what real products require.\u003C\u002Fp>\u003Ch2>Argument two: fine-tuning costs less than contorting a general model\u003C\u002Fh2>\u003Cp>Training a model from scratch is unrealistic for most teams. Fine-tuning's advantage is that it specializes on top of existing pretrained capability: you do not rebuild language ability, you spend only on the task you actually care about. That saves not just compute but also annotation, trial and error, and time to launch, which is critical for small teams in particular.\u003C\u002Fp>\u003Cp>In practice, a general model plus complex prompts often shifts the cost onto engineering. You have to maintain longer prompts, more complex retrieval flows, and more guardrails, while also handling version drift and regression testing. By contrast, a model fine-tuned for a fixed task requires data preparation up front, but its behavior after launch is simpler, and operations and testing are easier to standardize.\u003C\u002Fp>\u003Cp>This directly affects product iteration speed. When you need to support customer-service replies, document classification, and field extraction at the same time, fine-tune separate versions rather than stretching one general model across everything, so that each model focuses on a single goal. For the team, that means less prompt debt and clearer ownership of quality.\u003C\u002Fp>\u003Ch2>What the opposition might say\u003C\u002Fh2>\u003Cp>The opposition's strongest argument is that fine-tuning makes the system more brittle. If your task changes quickly, labeled data is scarce, or the work is inherently conversational and exploratory, then starting with prompt engineering, \u003Ca href=\"\u002Fnews\u002Flora-vs-qlora-vs-full-fine-tuning-zh\">RAG\u003C\u002Fa>, or tool calling really is more flexible. These methods need no retraining and make it easier to change the rules quickly.\u003C\u002Fp>\n\u003Cfigure class=\"my-6\">\u003Cimg src=\"https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778916235022-9of4.png\" alt=\"Why fine-tuning LLMs for domain tasks is the right default\" class=\"rounded-xl w-full\" loading=\"lazy\" \u002F>\u003C\u002Ffigure>\n\u003Cp>Another reasonable worry is that fine-tuning may learn the data's noise along with its signal. If annotation standards are inconsistent, the model treats the mistakes as patterns, and you end up with a system that looks fine in offline tests yet keeps missing in production. For some teams, rather than rushing to fine-tune, it is better to fix \u003Ca href=\"\u002Ftag\u002F資料治理\">data governance\u003C\u002Fa> first; otherwise you are only moving the problem from the prompt layer to the weight layer.\u003C\u002Fp>\u003Cp>But these criticisms do not overturn fine-tuning; they draw its boundaries. My position is explicit: when a task has stable inputs, stable outputs, and clear scoring criteria, fine-tuning remains the most reasonable default. What you should not do is treat it as a cure-all; if requirements are highly fluid to begin with, hold off on fine-tuning. That is not a rejection, it is choosing the right tool.\u003C\u002Fp>\u003Ch2>What you can do\u003C\u002Fh2>\u003Cp>If you are an engineer, build a regression-testable benchmark set before deciding whether to fine-tune; do not judge the model by gut feel alone. If you are a PM, require the team to define success metrics and failure cases first, because a task with no standard answer is not a candidate for fine-tuning at all. If you are a founder, put the data-quality budget ahead of compute, because model performance is usually limited by annotation and process, not by \u003Ca href=\"\u002Ftag\u002Fgpu\">GPU\u003C\u002Fa>s.\u003C\u002Fp>\u003Cp>The most practical approach is this: first use a general model to produce a 
baseline, then run error analysis on real business data to find which categories the errors cluster in, and only then decide whether to fine-tune. Once you can clearly state where the model fails, how often it fails, and how much a fix would be worth, fine-tuning stops being a technical preference and becomes a provable product decision.\u003C\u002Fp>","When an LLM must deliver high accuracy, fixed format, and stable output in a specific domain, fine-tuning is a better default than prompt engineering alone.","aigrants.in","https:\u002F\u002Faigrants.in\u002Ftopics\u002Ffine-tuning-llms-for-domain-specific-tasks",null,"https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778916229431-9olk.png",[13,14,15,16,17,18],"LLM 微調","領域任務","提示工程","模型準確性","資料標註","產品工程","zh",2,false,"2026-05-16T07:23:32.255569+00:00","2026-05-16T07:23:32.241+00:00","done","894d3265-5051-4cb9-a674-e8b0d299e262","why-fine-tuning-llms-domain-tasks-right-default-zh","research","d3d5812b-849a-4a6e-8c8c-d859618bd4b2","published","2026-05-16T09:00:15.235+00:00",[32,33,34],"In stable, scorable domain tasks, fine-tuning is usually more reliable than prompting alone.","The core value of fine-tuning is making model behavior more consistent and more testable.","If the task changes fast or data is scarce, start with prompt engineering and retrieval instead of rushing to fine-tune.","0c35a120-52fc-41fc-afa3-d404eb934158",null,
{"tags":38,"relatedLang":45,"relatedPosts":49},[39,40,42,43,44],{"name":14,"slug":14},{"name":13,"slug":41},"llm-微調",{"name":16,"slug":16},{"name":17,"slug":17},{"name":15,"slug":15},{"id":28,"slug":46,"title":47,"language":48},"why-fine-tuning-llms-domain-tasks-right-default-en","Why fine-tuning LLMs for domain tasks is the right default","en",[50,56,62,68,74,80],
{"id":51,"slug":52,"title":53,"cover_image":54,"image_url":54,"created_at":55,"category":27},"6ca303f0-7bd4-4bb2-be58-70d80da5ec40","why-ai-safety-teams-are-wrong-blame-only-alignment-zh","Why AI safety teams are wrong to blame alignment alone","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778947417022-ak55.png","2026-05-16T16:03:16.319335+00:00",
{"id":57,"slug":58,"title":59,"cover_image":60,"image_url":60,"created_at":61,"category":27},"001e062e-f246-4bf0-aa04-27506febcf7b","refdecoder-reference-conditioned-video-decoder-zh","RefDecoder lets video decoders take reference images","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778912646805-czy9.png","2026-05-16T06:23:33.170076+00:00",
{"id":63,"slug":64,"title":65,"cover_image":66,"image_url":66,"created_at":67,"category":27},"b9516feb-41d5-42a3-887e-7b47c5c9ffb7","atlas-one-token-visual-reasoning-zh","ATLAS does visual reasoning with a single token","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778912032775-hp0w.png","2026-05-16T06:13:34.693651+00:00",
{"id":69,"slug":70,"title":71,"cover_image":72,"image_url":72,"created_at":73,"category":27},"bfd03801-a200-4222-9370-8b441be41483","entitybench-long-range-video-consistency-zh","EntityBench keeps watch on long-video consistency","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778911845686-4mc8.png","2026-05-16T06:10:27.85068+00:00",
{"id":75,"slug":76,"title":77,"cover_image":78,"image_url":78,"created_at":79,"category":27},"667b72b6-e821-4d68-80a1-e03340bc85f1","turboquant-seo-shift-small-sites-zh","TurboQuant and the SEO shift for small sites","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778840440690-kcw9.png","2026-05-15T10:20:27.319472+00:00",
{"id":81,"slug":82,"title":83,"cover_image":84,"image_url":84,"created_at":85,"category":27},"381fb6c6-6da7-4444-831f-8c5eed8d685c","turboquant-vllm-comparison-fp8-kv-cache-zh","TurboQuant vs. FP8: measured results","https:\u002F\u002Fxxdpdyhzhpamafnrdkyq.supabase.co\u002Fstorage\u002Fv1\u002Fobject\u002Fpublic\u002Fcovers\u002Finline-1778839867551-4v9g.png","2026-05-15T10:10:36.034569+00:00",
[87,92,97,102,107,112,117,122,127,132],
{"id":88,"slug":89,"title":90,"created_at":91},"f18dbadb-8c59-4723-84a4-6ad22746c77a","deepmind-bets-on-continuous-learning-ai-2026-zh","DeepMind bets on continual-learning AI for 2026","2026-03-26T08:16:02.367355+00:00",
{"id":93,"slug":94,"title":95,"created_at":96},"f4a106cb-02a6-4508-8f39-9720a0a93cee","ml-papers-of-the-week-github-research-desk-zh","Why the weekly ML papers list blew up on GitHub","2026-03-27T01:11:39.284175+00:00",
{"id":98,"slug":99,"title":100,"created_at":101},"c4f807ca-4e5f-47f1-a48c-961cf3fc44dc","ai-ml-conferences-to-watch-in-2026-zh","2026 AI conference submission timelines, organized","2026-03-27T01:51:53.874432+00:00",
{"id":103,"slug":104,"title":105,"created_at":106},"9f50561b-aebd-46ba-94a8-363198aa7091","openclaw-agents-manipulated-self-sabotage-zh","OpenClaw agents can be manipulated into self-sabotage","2026-03-28T03:03:18.786425+00:00",
{"id":108,"slug":109,"title":110,"created_at":111},"11f22e92-7066-4978-a544-31f5f2156ec6","vega-learning-to-drive-with-natural-language-instructions-zh","Vega: learning to drive with natural-language instructions","2026-03-28T14:54:04.847912+00:00",
{"id":113,"slug":114,"title":115,"created_at":116},"a4c7cfec-8d0e-4fec-93cf-1b9699a530b8","drive-my-way-en-zh","Drive My Way: personalized autonomous driving styles","2026-03-28T14:54:26.207495+00:00",
{"id":118,"slug":119,"title":120,"created_at":121},"dec02f89-fd39-41ba-8e4d-11ede93a536d","training-knowledge-bases-with-writeback-rag-zh","Strengthening knowledge bases with WriteBack-RAG for better retrieval","2026-03-28T14:54:45.775606+00:00",
{"id":123,"slug":124,"title":125,"created_at":126},"3886be5c-a137-40cc-b9e2-0bf18430c002","packforcing-efficient-long-video-generation-method-zh","PackForcing: generating long videos from short-video training","2026-03-28T14:55:02.688141+00:00",
{"id":128,"slug":129,"title":130,"created_at":131},"72b90667-d930-4cc9-8ced-aaa0f8968d44","pixelsmile-toward-fine-grained-facial-expression-editing-zh","PixelSmile: a new method for fine-grained facial expression editing","2026-03-28T14:55:20.678181+00:00",
{"id":133,"slug":134,"title":135,"created_at":136},"cf046742-efb2-4753-aef9-caed5da5e32e","adaptive-block-scaled-data-types-zh","IF4: a smart choice for neural network quantization","2026-03-31T06:00:36.990273+00:00"]
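The article's closing advice (run a general-model baseline, then do error analysis on real business data before committing to fine-tuning) can be sketched in a few lines. This is a minimal hypothetical illustration, not code from the article: the `classify` stub, the label names, and the example texts are invented stand-ins for a prompted general-model baseline and a labeled benchmark set.

```python
# Hedged sketch of the baseline -> error-analysis loop. The `classify`
# function below is a toy stand-in for calling a general model with a prompt.
from collections import Counter

def classify(text: str) -> str:
    """Stand-in for a general-model baseline (e.g. a prompted LLM call)."""
    # Toy rule: any mention of "refund" is filed as a refund request.
    return "refund_request" if "refund" in text else "cancel_renewal"

def error_analysis(examples):
    """Return overall accuracy and a Counter of (gold, predicted) confusions."""
    confusions = Counter()
    correct = 0
    for text, gold in examples:
        pred = classify(text)
        if pred == gold:
            correct += 1
        else:
            confusions[(gold, pred)] += 1
    return correct / len(examples), confusions

# Hypothetical labeled benchmark set; in practice this is real business data.
examples = [
    ("please refund my last charge", "refund_request"),
    ("I want to cancel my renewal", "cancel_renewal"),
    ("cancel the renewal and refund me", "cancel_renewal"),  # baseline misfiles this
]

accuracy, confusions = error_analysis(examples)
print(accuracy)                    # two of three correct
print(confusions.most_common(1))   # the confusion pair errors cluster in
```

Once the most frequent `(gold, predicted)` confusion pairs are known, the decision the article describes becomes concrete: if errors cluster in a few stable categories with clear ground truth, that is the case for fine-tuning; if they are diffuse or the labels themselves are contested, fix the data first.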