<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Pretraining on 极客老墨</title>
    <link>https://blog.hankmo.com/tags/pretraining/</link>
    <description>Recent content in Pretraining on 极客老墨</description>
    <generator>Hugo -- 0.138.0</generator>
    <language>zh-cn</language>
    <lastBuildDate>Mon, 04 May 2026 00:00:00 +0000</lastBuildDate>
    <atom:link href="https://blog.hankmo.com/tags/pretraining/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>How Large Language Models Are Made: From Training Data to the API in Your Hands</title>
      <link>https://blog.hankmo.com/llm-how-it-works/</link>
      <pubDate>Mon, 04 May 2026 00:00:00 +0000</pubDate>
      <guid>https://blog.hankmo.com/llm-how-it-works/</guid>
      <description>How are large language models built, and how do they actually work? This article breaks down the full pipeline of &amp;#34;training data → pretraining → fine-tuning and alignment → deployment → inference&amp;#34;, tracing a model&amp;#39;s entire journey from nothing to the API call in your hands.</description>
    </item>
  </channel>
</rss>
