<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>DeepSeek on 极客老墨</title>
    <link>https://blog.hankmo.com/tags/deepseek/</link>
    <description>Recent content in DeepSeek on 极客老墨</description>
    <generator>Hugo -- 0.138.0</generator>
    <language>zh-cn</language>
    <lastBuildDate>Fri, 20 Feb 2026 00:00:00 +0000</lastBuildDate>
    <atom:link href="https://blog.hankmo.com/tags/deepseek/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>For the 2026 Year of the Horse Spring Festival, I Had AI Build Me a "Comeback" Mini Program</title>
      <link>https://blog.hankmo.com/2026-ai-ma-zuiti/</link>
      <pubDate>Fri, 20 Feb 2026 00:00:00 +0000</pubDate>
      <guid>https://blog.hankmo.com/2026-ai-ma-zuiti/</guid>
      <description>In 3 days, I used AI across the full workflow to build a Spring Festival comeback mini program: from product design, UI assets, and code implementation to launch, one person can do it all.</description>
    </item>
    <item>
      <title>EP00 - Hands-On Local Deployment of DeepSeek R1 (Mac Edition)</title>
      <link>https://blog.hankmo.com/posts/ai/local-deploy-deepseek-r1/</link>
      <pubDate>Tue, 03 Feb 2026 00:00:00 +0000</pubDate>
      <guid>https://blog.hankmo.com/posts/ai/local-deploy-deepseek-r1/</guid>
      <description>Don't be scared off by GPUs costing tens of thousands of yuan. Your MacBook Pro (Apple Silicon) is a great machine for running DeepSeek R1. This article walks you step by step through running the "full-power" reasoning model locally with Ollama: not only is it free, your data never leaves your machine.</description>
    </item>
  </channel>
</rss>
