<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/">
    <channel>
        <title>Tony's BLOG</title>
        <link>https://dundun0504.com/</link>
        <description>The hymn of mankind is the hymn of courage</description>
        <lastBuildDate>Mon, 30 Mar 2026 09:31:08 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>https://github.com/jpmonette/feed</generator>
        <language>zh-CN</language>
        <copyright>All rights reserved 2026, 吨吨吨吨吨</copyright>
        <item>
            <title><![CDATA[Data Analysis Written-Test Prep]]></title>
            <link>https://dundun0504.com/article/data-analysis-prep</link>
            <guid>https://dundun0504.com/article/data-analysis-prep</guid>
            <pubDate>Sat, 28 Mar 2026 00:00:00 GMT</pubDate>
            <content:encoded><![CDATA[<div id="notion-article" class="mx-auto overflow-hidden "><main class="notion light-mode notion-page notion-block-332670e5549980ad8ff7c2b74af0ea8b"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><details class="notion-toggle notion-block-332670e5549980da83e5fba5842f8097"><summary><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-332670e5549980da83e5fba5842f8097" data-id="332670e5549980da83e5fba5842f8097"><span><div id="332670e5549980da83e5fba5842f8097" class="notion-header-anchor"></div><span class="notion-h-title">SQL</span></span></h2></summary><div><details class="notion-toggle notion-block-332670e554998055a896e2120d0846d8"><summary><h4 class="notion-h notion-h3 notion-block-332670e554998055a896e2120d0846d8" data-id="332670e554998055a896e2120d0846d8"><span><div id="332670e554998055a896e2120d0846d8" class="notion-header-anchor"></div><span class="notion-h-title">SELECT DISTINCT</span></span></h4></summary><div><div class="notion-text notion-block-332670e5549980c0ac81fe8834b9553d">The SELECT DISTINCT statement is used to return only distinct (different) values.</div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e554998033be8dec43e4f2e6a8"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:417.9829406738281px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3Afa627ad7-2fbe-4297-a01e-b8d49f98d0a1%3Aimage.png?table=block&amp;id=332670e5-5499-8033-be8d-ec43e4f2e6a8&amp;t=332670e5-5499-8033-be8d-ec43e4f2e6a8" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e554998002a222f3435b4e3716"><summary><h4 class="notion-h notion-h3 notion-block-332670e554998002a222f3435b4e3716" data-id="332670e554998002a222f3435b4e3716"><span><div id="332670e554998002a222f3435b4e3716" class="notion-header-anchor"></div><span 
class="notion-h-title">WHERE</span></span></h4></summary><div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e554998034a1c1f2643f4f67bf"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:440.9801025390625px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A7e4f32bf-71ec-4a7f-b02a-d9fa552fd32f%3Aimage.png?table=block&amp;id=332670e5-5499-8034-a1c1-f2643f4f67bf&amp;t=332670e5-5499-8034-a1c1-f2643f4f67bf" alt="notion image" loading="lazy" decoding="async"/></div></figure><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980229e6df53ac50b8c11"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:507.9829406738281px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A927f0261-83fe-48c4-bce8-5ef7d0944ec9%3Aimage.png?table=block&amp;id=332670e5-5499-8022-9e6d-f53ac50b8c11&amp;t=332670e5-5499-8022-9e6d-f53ac50b8c11" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e5549980249f23f61436c6be7b"><summary><h4 class="notion-h notion-h3 notion-block-332670e5549980249f23f61436c6be7b" data-id="332670e5549980249f23f61436c6be7b"><span><div id="332670e5549980249f23f61436c6be7b" class="notion-header-anchor"></div><span class="notion-h-title">AND &amp; OR</span></span></h4></summary><div><div class="notion-text notion-block-332670e55499809598e5cd696ecb1b86">The AND operator displays a record if both the first condition and the second condition hold.</div><div class="notion-text notion-block-332670e5549980338192c611731cd079">The OR operator displays a record if at least one of the two conditions holds.</div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e554998010b652ca56b8df9c35"><div 
style="position:relative;display:flex;justify-content:center;align-self:center;width:343.991455078125px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A217b6d2e-ef91-4018-a42b-62dd70f41d41%3Aimage.png?table=block&amp;id=332670e5-5499-8010-b652-ca56b8df9c35&amp;t=332670e5-5499-8010-b652-ca56b8df9c35" alt="notion image" loading="lazy" decoding="async"/></div></figure><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980bcb358c30f803b9254"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:440.9801025390625px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A7db502d5-b2c0-4ab5-bdab-9b613c79c6a6%3Aimage.png?table=block&amp;id=332670e5-5499-80bc-b358-c30f803b9254&amp;t=332670e5-5499-80bc-b358-c30f803b9254" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e55499800b9a60d620fcd8d294"><summary><h4 class="notion-h notion-h3 notion-block-332670e55499800b9a60d620fcd8d294" data-id="332670e55499800b9a60d620fcd8d294"><span><div id="332670e55499800b9a60d620fcd8d294" class="notion-header-anchor"></div><span class="notion-h-title">ORDER BY</span></span></h4></summary><div><div class="notion-text notion-block-332670e5549980bb88b3fa34e616ee63">The ORDER BY keyword is used to sort the result set by one or more columns.</div><div class="notion-text notion-block-332670e5549980aeaf8cc3e71be3506c">The ORDER BY keyword sorts the records in ascending order by default. To sort the records in descending order, you can use the DESC keyword.</div><div class="notion-blank notion-block-332670e55499802a96cce9ff078daf01"> </div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980329788f46a1c3249c6"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:422.9829406738281px;max-width:100%;flex-direction:column"><img style="object-fit:cover" 
src="https://www.notion.so/image/attachment%3A9f67083c-a3d6-4441-a7af-dcf3ec4f75f9%3Aimage.png?table=block&amp;id=332670e5-5499-8032-9788-f46a1c3249c6&amp;t=332670e5-5499-8032-9788-f46a1c3249c6" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e554998006b05ff3c7344bd6ba"><summary><h4 class="notion-h notion-h3 notion-block-332670e554998006b05ff3c7344bd6ba" data-id="332670e554998006b05ff3c7344bd6ba"><span><div id="332670e554998006b05ff3c7344bd6ba" class="notion-header-anchor"></div><span class="notion-h-title">INSERT INTO</span></span></h4></summary><div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e554998029a540c9a2924b3dfa"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:558.9772338867188px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A8bdbea7c-18cd-4dab-a0c7-55b7ac31f726%3Aimage.png?table=block&amp;id=332670e5-5499-8029-a540-c9a2924b3dfa&amp;t=332670e5-5499-8029-a540-c9a2924b3dfa" alt="notion image" loading="lazy" decoding="async"/></div></figure><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e55499806daf64d5c0a250d5c1"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:562.98291015625px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3Ab030607b-013b-426f-88df-67d4f06be5e3%3Aimage.png?table=block&amp;id=332670e5-5499-806d-af64-d5c0a250d5c1&amp;t=332670e5-5499-806d-af64-d5c0a250d5c1" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e5549980c5853acfbdef1e5746"><summary><h4 class="notion-h notion-h3 notion-block-332670e5549980c5853acfbdef1e5746" data-id="332670e5549980c5853acfbdef1e5746"><span><div 
id="332670e5549980c5853acfbdef1e5746" class="notion-header-anchor"></div><span class="notion-h-title">UPDATE</span></span></h4></summary><div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e55499801a95c3f633ff44d87c"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:430.9801025390625px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A5aa57446-e688-47f3-806a-53c505025192%3Aimage.png?table=block&amp;id=332670e5-5499-801a-95c3-f633ff44d87c&amp;t=332670e5-5499-801a-95c3-f633ff44d87c" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e5549980df844cf8b0635d85e4"><summary><h4 class="notion-h notion-h3 notion-block-332670e5549980df844cf8b0635d85e4" data-id="332670e5549980df844cf8b0635d85e4"><span><div id="332670e5549980df844cf8b0635d85e4" class="notion-header-anchor"></div><span class="notion-h-title">DELETE</span></span></h4></summary><div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980cfacc4f37d94fe8135"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:478px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3Aa0de78d8-b629-4746-88b0-26c173f3edc1%3Aimage.png?table=block&amp;id=332670e5-5499-80cf-acc4-f37d94fe8135&amp;t=332670e5-5499-80cf-acc4-f37d94fe8135" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e5549980169b98cd56f7d1008c"><summary><h4 class="notion-h notion-h3 notion-block-332670e5549980169b98cd56f7d1008c" data-id="332670e5549980169b98cd56f7d1008c"><span><div id="332670e5549980169b98cd56f7d1008c" class="notion-header-anchor"></div><span class="notion-h-title">SELECT TOP</span></span></h4></summary><div><figure 
class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e554998036bbeae0e29e8e9bcc"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:228.99147033691406px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3Aa4580cf5-10ad-41ac-a250-b16c0ae08327%3Aimage.png?table=block&amp;id=332670e5-5499-8036-bbea-e0e29e8e9bcc&amp;t=332670e5-5499-8036-bbea-e0e29e8e9bcc" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e5549980a49fe7f8896e7e5665"><summary><h4 class="notion-h notion-h3 notion-block-332670e5549980a49fe7f8896e7e5665" data-id="332670e5549980a49fe7f8896e7e5665"><span><div id="332670e5549980a49fe7f8896e7e5665" class="notion-header-anchor"></div><span class="notion-h-title">LIKE</span></span></h4></summary><div><div class="notion-text notion-block-332670e554998074ba63e5c5fbcd9078">The pattern is written with wildcard characters (a regex-like syntax).</div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980e68aecf7c015d8c7b1"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:459.98577880859375px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A2f80b7fb-fcf2-4a27-8913-9e60c6da218a%3Aimage.png?table=block&amp;id=332670e5-5499-80e6-8aec-f7c015d8c7b1&amp;t=332670e5-5499-80e6-8aec-f7c015d8c7b1" alt="notion image" loading="lazy" decoding="async"/></div></figure><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980c8b9bdf84e4d66105f"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:474.98577880859375px;max-width:100%;flex-direction:column"><img style="object-fit:cover" 
src="https://www.notion.so/image/attachment%3Af61e65c7-3fb8-46b3-b0a4-a2688b027631%3Aimage.png?table=block&amp;id=332670e5-5499-80c8-b9bd-f84e4d66105f&amp;t=332670e5-5499-80c8-b9bd-f84e4d66105f" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e5549980268752c725c2d35303"><summary><h4 class="notion-h notion-h3 notion-block-332670e5549980268752c725c2d35303" data-id="332670e5549980268752c725c2d35303"><span><div id="332670e5549980268752c725c2d35303" class="notion-header-anchor"></div><span class="notion-h-title">IN</span></span></h4></summary><div></div></details><details class="notion-toggle notion-block-332670e55499807dbea2ed8a235a6480"><summary><h4 class="notion-h notion-h3 notion-block-332670e55499807dbea2ed8a235a6480" data-id="332670e55499807dbea2ed8a235a6480"><span><div id="332670e55499807dbea2ed8a235a6480" class="notion-header-anchor"></div><span class="notion-h-title">BETWEEN </span></span></h4></summary><div><h5 class="notion-h notion-h4 notion-block-332670e55499805aa13eebc557271218" data-id="332670e55499805aa13eebc557271218"><span><div id="332670e55499805aa13eebc557271218" class="notion-header-anchor"></div><a class="notion-hash-link" href="#332670e55499805aa13eebc557271218" title="NOT BETWEEN"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">NOT BETWEEN</span></span></h5><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e55499807a92cbdd004b044179"><div 
style="position:relative;display:flex;justify-content:center;align-self:center;width:445.9801025390625px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A6aff9d49-3f34-4717-aec7-498f0592b858%3Aimage.png?table=block&amp;id=332670e5-5499-807a-92cb-dd004b044179&amp;t=332670e5-5499-807a-92cb-dd004b044179" alt="notion image" loading="lazy" decoding="async"/></div></figure><h5 class="notion-h notion-h4 notion-block-332670e554998042b6cdc655badd6aac" data-id="332670e554998042b6cdc655badd6aac"><span><div id="332670e554998042b6cdc655badd6aac" class="notion-header-anchor"></div><a class="notion-hash-link" href="#332670e554998042b6cdc655badd6aac" title="BETWEEN with Text Values"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">BETWEEN with Text Values</span></span></h5><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980f9bcb6d98cbc1af6e4"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:436.97442626953125px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3Ad809297c-0a41-4dbe-8fc3-c9fd69f6d5b4%3Aimage.png?table=block&amp;id=332670e5-5499-80f9-bcb6-d98cbc1af6e4&amp;t=332670e5-5499-80f9-bcb6-d98cbc1af6e4" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e554998036bd04d5f8b4560b65"><summary><h4 class="notion-h notion-h3 notion-block-332670e554998036bd04d5f8b4560b65" 
data-id="332670e554998036bd04d5f8b4560b65"><span><div id="332670e554998036bd04d5f8b4560b65" class="notion-header-anchor"></div><span class="notion-h-title">Aliases</span></span></h4></summary><div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980fbae7bcc5a5a4f1424"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A92d71fb6-be28-47f9-bd5f-121e7bcdf4e3%3Aimage.png?table=block&amp;id=332670e5-5499-80fb-ae7b-cc5a5a4f1424&amp;t=332670e5-5499-80fb-ae7b-cc5a5a4f1424" alt="notion image" loading="lazy" decoding="async"/></div></figure><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e554998041a6d6ec0352d7d585"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:432px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3Ae9c4f1d6-1b53-47de-bf2e-1f6484d8a676%3Aimage.png?table=block&amp;id=332670e5-5499-8041-a6d6-ec0352d7d585&amp;t=332670e5-5499-8041-a6d6-ec0352d7d585" alt="notion image" loading="lazy" decoding="async"/></div></figure><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e554998042a3a9e5eadff8d9bd"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:316.9886169433594px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A1d66122a-5aea-4f30-a45b-2cd8c909847c%3Aimage.png?table=block&amp;id=332670e5-5499-8042-a3a9-e5eadff8d9bd&amp;t=332670e5-5499-8042-a3a9-e5eadff8d9bd" alt="notion image" loading="lazy" decoding="async"/></div></figure><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980498699d5e96253b51b"><div 
style="position:relative;display:flex;justify-content:center;align-self:center;width:550.9658813476562px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3Aec988313-1410-4c9a-ad9c-d918e60c6075%3Aimage.png?table=block&amp;id=332670e5-5499-8049-8699-d5e96253b51b&amp;t=332670e5-5499-8049-8699-d5e96253b51b" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e55499802fbc54c87be9761de0"><summary><h4 class="notion-h notion-h3 notion-block-332670e55499802fbc54c87be9761de0" data-id="332670e55499802fbc54c87be9761de0"><span><div id="332670e55499802fbc54c87be9761de0" class="notion-header-anchor"></div><span class="notion-h-title">JOIN</span></span></h4></summary><div><details class="notion-toggle notion-block-332670e55499807faa77c803e5cf1e99"><summary><h5 class="notion-h notion-h4 notion-block-332670e55499807faa77c803e5cf1e99" data-id="332670e55499807faa77c803e5cf1e99"><span><div id="332670e55499807faa77c803e5cf1e99" class="notion-header-anchor"></div><span class="notion-h-title">INNER JOIN returns the records from both tables that satisfy the join condition (intersection)</span></span></h5></summary><div></div></details><details class="notion-toggle notion-block-332670e5549980f090c9f5a21cef8870"><summary><h5 class="notion-h notion-h4 notion-block-332670e5549980f090c9f5a21cef8870" data-id="332670e5549980f090c9f5a21cef8870"><span><div id="332670e5549980f090c9f5a21cef8870" class="notion-header-anchor"></div><span class="notion-h-title">LEFT JOIN returns all records from the left table, even when the right table has no matching records (keeps the left table)</span></span></h5></summary><div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-332670e5549980cab6c0c1838d96bac3"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" 
src="https://www.notion.so/image/attachment%3Ad54678cc-2993-4a41-a62c-7334fed1d4c8%3Aimage.png?table=block&amp;id=332670e5-5499-80ca-b6c0-c1838d96bac3&amp;t=332670e5-5499-80ca-b6c0-c1838d96bac3" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-332670e554998069b08be2e7c9c594bb"><summary><h5 class="notion-h notion-h4 notion-block-332670e554998069b08be2e7c9c594bb" data-id="332670e554998069b08be2e7c9c594bb"><span><div id="332670e554998069b08be2e7c9c594bb" class="notion-header-anchor"></div><span class="notion-h-title">RIGHT JOIN returns all records from the right table, even when the left table has no matching records (keeps the right table)</span></span></h5></summary><div></div></details><details class="notion-toggle notion-block-332670e5549980f2bb4aefaafcfa91af"><summary><h5 class="notion-h notion-h4 notion-block-332670e5549980f2bb4aefaafcfa91af" data-id="332670e5549980f2bb4aefaafcfa91af"><span><div id="332670e5549980f2bb4aefaafcfa91af" class="notion-header-anchor"></div><span class="notion-h-title">FULL OUTER JOIN returns the union of the two tables, including both matched and unmatched records</span></span></h5></summary><div></div></details><details class="notion-toggle notion-block-332670e5549980459935e8f27046ffe1"><summary><h5 class="notion-h notion-h4 notion-block-332670e5549980459935e8f27046ffe1" data-id="332670e5549980459935e8f27046ffe1"><span><div id="332670e5549980459935e8f27046ffe1" class="notion-header-anchor"></div><span class="notion-h-title">CROSS JOIN returns the Cartesian product of the two tables, pairing every left-table record with every right-table record</span></span></h5></summary><div></div></details><details class="notion-toggle notion-block-332670e5549980f28d9ede91833743f1"><summary><h5 class="notion-h notion-h4 notion-block-332670e5549980f28d9ede91833743f1" data-id="332670e5549980f28d9ede91833743f1"><span><div id="332670e5549980f28d9ede91833743f1" class="notion-header-anchor"></div><span class="notion-h-title">SELF JOIN joins a table with itself</span></span></h5></summary><div></div></details><details class="notion-toggle notion-block-332670e55499804aa494cbcc6dbaef93"><summary><h5 class="notion-h 
notion-h4 notion-block-332670e55499804aa494cbcc6dbaef93" data-id="332670e55499804aa494cbcc6dbaef93"><span><div id="332670e55499804aa494cbcc6dbaef93" class="notion-header-anchor"></div><span class="notion-h-title">NATURAL JOIN automatically joins tables by matching columns with the same name</span></span></h5></summary><div></div></details></div></details><details class="notion-toggle notion-block-332670e5549980f3abebef41dfea4b07"><summary><h4 class="notion-h notion-h3 notion-block-332670e5549980f3abebef41dfea4b07" data-id="332670e5549980f3abebef41dfea4b07"><span><div id="332670e5549980f3abebef41dfea4b07" class="notion-header-anchor"></div><span class="notion-h-title">UNION</span></span></h4></summary><div></div></details><details class="notion-toggle notion-block-333670e5549980beb870cd20b927724f"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980beb870cd20b927724f" data-id="333670e5549980beb870cd20b927724f"><span><div id="333670e5549980beb870cd20b927724f" class="notion-header-anchor"></div><span class="notion-h-title">SELECT INTO</span></span></h4></summary><div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-333670e55499804fa35fd5671cad8b39"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:546px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A8104dc05-679f-407f-8bbc-cd77c5f43595%3Aimage.png?table=block&amp;id=333670e5-5499-804f-a35f-d5671cad8b39&amp;t=333670e5-5499-804f-a35f-d5671cad8b39" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-333670e5549980828e10c4dd20ca935f"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980828e10c4dd20ca935f" data-id="333670e5549980828e10c4dd20ca935f"><span><div id="333670e5549980828e10c4dd20ca935f" class="notion-header-anchor"></div><span class="notion-h-title">CREATE DATABASE</span></span></h4></summary><div></div></details><details 
class="notion-toggle notion-block-333670e5549980e393eacf7bf6808065"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980e393eacf7bf6808065" data-id="333670e5549980e393eacf7bf6808065"><span><div id="333670e5549980e393eacf7bf6808065" class="notion-header-anchor"></div><span class="notion-h-title">CREATE TABLE</span></span></h4></summary><div></div></details><details class="notion-toggle notion-block-333670e5549980559f9fe6ca114174ca"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980559f9fe6ca114174ca" data-id="333670e5549980559f9fe6ca114174ca"><span><div id="333670e5549980559f9fe6ca114174ca" class="notion-header-anchor"></div><span class="notion-h-title">Constraints</span></span></h4></summary><div><div class="notion-text notion-block-333670e5549980f3950cf5fb193fcba5">See the CPSC304 tutorial.</div></div></details><details class="notion-toggle notion-block-333670e5549980fda18bd165cecb0899"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980fda18bd165cecb0899" data-id="333670e5549980fda18bd165cecb0899"><span><div id="333670e5549980fda18bd165cecb0899" class="notion-header-anchor"></div><span class="notion-h-title">DEFAULT</span></span></h4></summary><div></div></details><details class="notion-toggle notion-block-333670e5549980599e13e1e1b1a78901"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980599e13e1e1b1a78901" data-id="333670e5549980599e13e1e1b1a78901"><span><div id="333670e5549980599e13e1e1b1a78901" class="notion-header-anchor"></div><span class="notion-h-title">CREATE INDEX</span></span></h4></summary><div><div class="notion-text notion-block-333670e55499805abe50c5046b51c00f">You can create indexes on a table to query data more quickly and efficiently.</div><div class="notion-text notion-block-333670e55499809884e2d97d4b0469ed">Users cannot see indexes; they are only used to speed up searches/queries.</div><div class="notion-text notion-block-333670e55499804aaf19ce1e129666ec"><b>Note:</b> Updating a table that has indexes takes more time than updating a table without them, because the indexes themselves must also be updated. Ideally, you should therefore only create indexes on columns (and tables) that are searched frequently.</div></div></details><details class="notion-toggle 
notion-block-333670e55499800b9ab0e1d01682a26a"><summary><h4 class="notion-h notion-h3 notion-block-333670e55499800b9ab0e1d01682a26a" data-id="333670e55499800b9ab0e1d01682a26a"><span><div id="333670e55499800b9ab0e1d01682a26a" class="notion-header-anchor"></div><span class="notion-h-title">DROP</span></span></h4></summary><div></div></details><details class="notion-toggle notion-block-333670e5549980269bf6c8fc2a29ec4c"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980269bf6c8fc2a29ec4c" data-id="333670e5549980269bf6c8fc2a29ec4c"><span><div id="333670e5549980269bf6c8fc2a29ec4c" class="notion-header-anchor"></div><span class="notion-h-title">ALTER</span></span></h4></summary><div><div class="notion-text notion-block-333670e55499809c8345c083088cbe2b">To add a column to a table, use the following syntax:</div><div class="notion-text notion-block-333670e5549980c08fb1d6736eb264ad">To drop a column from a table, use the following syntax (note that some database systems do not allow dropping a column this way):</div><div class="notion-blank notion-block-333670e5549980e796e1f79bf679be98"> </div></div></details><details class="notion-toggle notion-block-333670e5549980dfb158ffd84ee395cb"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980dfb158ffd84ee395cb" data-id="333670e5549980dfb158ffd84ee395cb"><span><div id="333670e5549980dfb158ffd84ee395cb" class="notion-header-anchor"></div><span class="notion-h-title">CREATE VIEW</span></span></h4></summary><div></div></details><details class="notion-toggle notion-block-333670e554998018945aff9af4a85a67"><summary><h4 class="notion-h notion-h3 notion-block-333670e554998018945aff9af4a85a67" data-id="333670e554998018945aff9af4a85a67"><span><div id="333670e554998018945aff9af4a85a67" class="notion-header-anchor"></div><span class="notion-h-title">Date Functions</span></span></h4></summary><div><div class="notion-text notion-block-333670e5549980abb39fe1da82abebca"><b>MySQL</b> uses the following data types to store date or date/time values in a database:</div><ul class="notion-list notion-list-disc notion-block-333670e5549980a281d9d2fd0e7b3b9d"><li>DATE - format: YYYY-MM-DD</li></ul><ul 
class="notion-list notion-list-disc notion-block-333670e5549980e3a4fedad3ec2fee6b"><li>DATETIME - format: YYYY-MM-DD HH:MM:SS</li></ul><ul class="notion-list notion-list-disc notion-block-333670e554998054bea0eb0986f38c40"><li>TIMESTAMP - format: YYYY-MM-DD HH:MM:SS</li></ul><ul class="notion-list notion-list-disc notion-block-333670e5549980e1893ed6d3b348db8f"><li>YEAR - format: YYYY or YY</li></ul><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-333670e554998047b5a7d40c105a78e5"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:100%;max-width:100%;flex-direction:column;height:100%"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3Ae8f9dbef-8240-417f-926b-454968212286%3Aimage.png?table=block&amp;id=333670e5-5499-8047-b5a7-d40c105a78e5&amp;t=333670e5-5499-8047-b5a7-d40c105a78e5" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-333670e55499800d9395ee114377cf1f"><summary><h4 class="notion-h notion-h3 notion-block-333670e55499800d9395ee114377cf1f" data-id="333670e55499800d9395ee114377cf1f"><span><div id="333670e55499800d9395ee114377cf1f" class="notion-header-anchor"></div><span class="notion-h-title">NULL</span></span></h4></summary><div></div></details><details class="notion-toggle notion-block-333670e55499800c955dc5c485d3d6e8"><summary><h4 class="notion-h notion-h3 notion-block-333670e55499800c955dc5c485d3d6e8" data-id="333670e55499800c955dc5c485d3d6e8"><span><div id="333670e55499800c955dc5c485d3d6e8" class="notion-header-anchor"></div><span class="notion-h-title">GROUP BY</span></span></h4></summary><div></div></details><details class="notion-toggle notion-block-333670e5549980d4ba4cfed3389912fa"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980d4ba4cfed3389912fa" data-id="333670e5549980d4ba4cfed3389912fa"><span><div id="333670e5549980d4ba4cfed3389912fa" class="notion-header-anchor"></div><span 
class="notion-h-title">HAVING</span></span></h4></summary><div><div class="notion-text notion-block-333670e55499806cb4ddd25a9ea1f3d4">The HAVING clause was added to SQL because the WHERE keyword cannot be used with aggregate functions.</div><div class="notion-text notion-block-333670e5549980ae930bcd4f87978fb3">The HAVING clause lets us filter the groups produced by GROUP BY.</div></div></details><details class="notion-toggle notion-block-333670e5549980ecb853eff5d489f112"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980ecb853eff5d489f112" data-id="333670e5549980ecb853eff5d489f112"><span><div id="333670e5549980ecb853eff5d489f112" class="notion-header-anchor"></div><span class="notion-h-title">EXISTS</span></span></h4></summary><div><div class="notion-text notion-block-333670e5549980d5b63ef6335ca272b8">The EXISTS operator tests whether a subquery returns any records: it returns True if one or more records exist, and False otherwise.</div></div></details><details class="notion-toggle notion-block-333670e55499808cac24e7ddb2215a60"><summary><h4 class="notion-h notion-h3 notion-block-333670e55499808cac24e7ddb2215a60" data-id="333670e55499808cac24e7ddb2215a60"><span><div id="333670e55499808cac24e7ddb2215a60" class="notion-header-anchor"></div><span class="notion-h-title">UCASE()</span></span></h4></summary><div><div class="notion-text notion-block-333670e55499800cb9dfcbbafe3ffa3f">The UCASE() function converts the value of a field to uppercase.</div></div></details><details class="notion-toggle notion-block-333670e554998048bcb7e214091c29e0"><summary><h4 class="notion-h notion-h3 notion-block-333670e554998048bcb7e214091c29e0" data-id="333670e554998048bcb7e214091c29e0"><span><div id="333670e554998048bcb7e214091c29e0" class="notion-header-anchor"></div><span class="notion-h-title">MID()</span></span></h4></summary><div><div class="notion-text notion-block-333670e5549980fd80c1f35d3d301099">The MID() function is used to extract characters from a text field.</div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-333670e5549980d1bec7d627d4be0aba"><div 
style="position:relative;display:flex;justify-content:center;align-self:center;width:533.991455078125px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A21b94cdf-882b-4a6f-a6fc-527715ed0f71%3Aimage.png?table=block&amp;id=333670e5-5499-80d1-bec7-d627d4be0aba&amp;t=333670e5-5499-80d1-bec7-d627d4be0aba" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-333670e554998085a781d687ea5383cf"><summary><h4 class="notion-h notion-h3 notion-block-333670e554998085a781d687ea5383cf" data-id="333670e554998085a781d687ea5383cf"><span><div id="333670e554998085a781d687ea5383cf" class="notion-header-anchor"></div><span class="notion-h-title">LEN()</span></span></h4></summary><div><div class="notion-text notion-block-333670e5549980d3888af4beb704316e">LEN() 函数返回文本字段中值的长度。</div><figure class="notion-asset-wrapper notion-asset-wrapper-image notion-block-333670e554998058bd29fb6833077183"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:331px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3Aed81004e-997a-4dd8-9302-c35792884cd1%3Aimage.png?table=block&amp;id=333670e5-5499-8058-bd29-fb6833077183&amp;t=333670e5-5499-8058-bd29-fb6833077183" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details><details class="notion-toggle notion-block-333670e5549980469861f35dba2afdf8"><summary><h4 class="notion-h notion-h3 notion-block-333670e5549980469861f35dba2afdf8" data-id="333670e5549980469861f35dba2afdf8"><span><div id="333670e5549980469861f35dba2afdf8" class="notion-header-anchor"></div><span class="notion-h-title">ROUND()</span></span></h4></summary><div><div class="notion-text notion-block-333670e55499803fbe92cd5e497eba83">ROUND() 函数用于把数值字段舍入为指定的小数位数。</div><figure class="notion-asset-wrapper notion-asset-wrapper-image 
notion-block-333670e5549980339b39c2e2b4ee3561"><div style="position:relative;display:flex;justify-content:center;align-self:center;width:316.9886169433594px;max-width:100%;flex-direction:column"><img style="object-fit:cover" src="https://www.notion.so/image/attachment%3A7e469f0b-8b47-48e3-b3e2-254d3076fdd1%3Aimage.png?table=block&amp;id=333670e5-5499-8033-9b39-c2e2b4ee3561&amp;t=333670e5-5499-8033-9b39-c2e2b4ee3561" alt="notion image" loading="lazy" decoding="async"/></div></figure></div></details></div></details></main></div>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[AI Daily | 2026-03-29]]></title>
            <link>https://dundun0504.com/article/ai-daily-2026-03-29</link>
            <guid>https://dundun0504.com/article/ai-daily-2026-03-29</guid>
            <pubDate>Sun, 29 Mar 2026 00:00:00 GMT</pubDate>
            <description><![CDATA[AI Daily for 2026-03-29: Mistral Voxtral TTS release, Gemini 3.1 Flash Live launch, new VideoSeek/UniMotion papers, NVIDIA Rubin platform GTC updates, and OpenClaw going viral on GitHub, plus selected vision papers relevant to climbing-movement analysis.]]></description>
            <content:encoded><![CDATA[<div id="notion-article" class="mx-auto overflow-hidden "><main class="notion light-mode notion-page notion-block-332670e55499812cb824fe1686f3d060"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-40cd6f9f7cf74895900894e0fca38814" data-id="40cd6f9f7cf74895900894e0fca38814"><span><div id="40cd6f9f7cf74895900894e0fca38814" class="notion-header-anchor"></div><a class="notion-hash-link" href="#40cd6f9f7cf74895900894e0fca38814" title="1. Today's 5 Most Important Items"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">1. Today's 5 Most Important Items</span></span></h2><div class="notion-text notion-block-1dd9b8f8b51b4fd08b1775155dcde6c9"><b>① Mistral Voxtral TTS released (2026-03-26)</b></div><div class="notion-text notion-block-416642f08c874e8487de5600dd006571">A lightweight open-source TTS model supporting 9 languages (English, French, German, Spanish, Portuguese, Italian, Dutch, Hindi, Arabic), designed for edge devices (smartwatches, phones). Apache 2.0 licensed and deployable locally. Directly applicable to the voice-feedback module of a climbing app.</div><div class="notion-text notion-block-3112b19ac001435da41df0ee2aefb912">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://mistral.ai/news">Mistral News</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://techcrunch.com/2026/03/26/mistral-releases-a-new-open-source-model-for-speech-generation/">TechCrunch coverage</a></div><div class="notion-text notion-block-3435944320ce4c74bd44514dce57a7d6"><b>② Gemini 3.1 Flash Live launched (2026-03-26)</b></div><div class="notion-text notion-block-ef1c395e58074cbe9700284880b44bbf">Google DeepMind released a real-time multimodal conversation model: native audio input/output, a 128K-token context window, and streaming audio / image / video / text. Pricing is very low, making it a good fit for real-time video-analysis pipelines. A major win for "upload video → real-time analysis" applications.</div><div class="notion-text notion-block-9e71dd44c8814558b691f5115c5747d6">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://deepmind.google/models/model-cards/gemini-3-1-flash-live/">Model Card</a></div><div class="notion-text notion-block-4ca393e25f0345ae8107774b7c23c42d"><b>③ VideoSeek: a long-video agent framework (latest on arXiv)</b></div><div class="notion-text notion-block-1fcc7b21ecb74981a477c1b68cf60b35">Proposes an agent guided by a "video logic flow": a think-act-observe loop plus a multi-granularity video toolkit sharply reduces the number of frames to process while improving long-video QA accuracy. Highly relevant to the video-retrieval and segment-localization modules of a climbing-movement analysis app.</div><div class="notion-text notion-block-914b81b1ecf24a2e8ea00a6dec78167e">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://papers.cool/arxiv/2603.20185">papers.cool/arxiv/2603.20185</a></div><div class="notion-text notion-block-fc1527c4909f42079017be203d350cd7"><b>④ UniMotion: a unified motion understanding and generation framework (latest on arXiv)</b></div><div class="notion-text notion-block-59d6698679084e188926789f0bb9ae31">The first unified framework to support both understanding and generation across human motion / natural language / RGB images; proposes a Cross-Modal Aligned Motion VAE (CMA-VAE) that treats motion as a first-class continuous modality. Highly instructive for work on action recognition plus generated movement feedback.</div><div class="notion-text notion-block-7eec97108b804e6eaf3ccd046fc436d5">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/list/cs.AI/current">arXiv cs.CV/cs.AI latest listings</a></div><div class="notion-text notion-block-453dbfe0c09141f692dea4961df2aa2f"><b>⑤ NVIDIA Rubin platform + GTC 2026 infrastructure announcements</b></div><div class="notion-text notion-block-40cec8decaed4921bcdbc2926c1ed679">The Rubin architecture (6 new chips) was announced; Microsoft + NVIDIA are deploying hundreds of thousands of liquid-cooled Grace Blackwell GPUs on Azure, and Fairwater AI Superfactories are built on NVL72. Inference costs will keep falling, which directly affects the commercial viability of inference-heavy video-analysis applications.</div><div class="notion-text
notion-block-078c1adb0e27491f9b4ad07f97e1038f">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://nvidianews.nvidia.com/news/rubin-platform-ai-supercomputer">NVIDIA Rubin Platform</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blogs.nvidia.com/blog/gtc-2026-news/">GTC 2026 Blog</a></div><hr class="notion-hr notion-block-9585543939944b9bbf940fe88e11cee0"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-5c903d99c44c4414b84dd708b9da5b15" data-id="5c903d99c44c4414b84dd708b9da5b15"><span><div id="5c903d99c44c4414b84dd708b9da5b15" class="notion-header-anchor"></div><a class="notion-hash-link" href="#5c903d99c44c4414b84dd708b9da5b15" title="2. Classified by Goal"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">2. Classified by Goal</span></span></h2><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-327116826c04439fb7004ab5ada614a6" data-id="327116826c04439fb7004ab5ada614a6"><span><div id="327116826c04439fb7004ab5ada614a6" class="notion-header-anchor"></div><a class="notion-hash-link" href="#327116826c04439fb7004ab5ada614a6" title="A. Frontier Models / First-party Releases"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">A. Frontier Models / First-party Releases</span></span></h3><div class="notion-text notion-block-c036da14a5d54e96870ce17ac9251bf7"><b>Gemini 3.1 Flash Live (2026-03-26)</b></div><ul class="notion-list notion-list-disc notion-block-2f4c10752b8f4b83b0c8499b35657735"><li>Event: Google DeepMind officially released Gemini 3.1 Flash Live</li></ul><ul class="notion-list notion-list-disc notion-block-0f2ea36df4e8415d940f0b8b766ffdf3"><li>Key content: native audio I/O, 128K context, real-time streaming multimodality (audio/image/video/text). Extremely low pricing at $0.25/M input tokens (Flash-Lite price tier)</li></ul><ul class="notion-list notion-list-disc notion-block-455ce7748483473eb86e18a44bb41b43"><li>Why it matters: the first multimodal streaming model that is viable on both price and latency; a game changer for video-analysis pipelines</li></ul><ul class="notion-list notion-list-disc notion-block-2f98514e907f4b3d987eac369d741d05"><li>Should I click through: <b>Yes</b>, especially the video-stream input API docs</li></ul><ul class="notion-list notion-list-disc notion-block-4664c33b52b04ad9aa58aa4040521877"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://deepmind.google/models/model-cards/gemini-3-1-flash-live">deepmind.google/models/model-cards/gemini-3-1-flash-live</a></li></ul><div class="notion-text notion-block-3537a70df72c4fdbb727012c71bed6a1"><b>Mistral Small 4 (2026-03-03) + Voxtral TTS (2026-03-26)</b></div><ul class="notion-list notion-list-disc notion-block-8e9f7ae7eedf431592e50af48394dba6"><li>Event: dual release of Mistral Small 4 (22B, Apache 2.0) and the lightweight Voxtral TTS</li></ul><ul class="notion-list notion-list-disc notion-block-ce0909d82af84176805731427c7c4d73"><li>Key content: Small 4 beats models 3-5× its size on reasoning/instruction following; Voxtral supports 9 languages and can run on a smartwatch</li></ul><ul class="notion-list notion-list-disc notion-block-fa03aeb24ced460997446131c83eded4"><li>Why it matters: two high-quality open-source models, one lowering local inference costs, the other opening up edge speech synthesis</li></ul><ul class="notion-list notion-list-disc notion-block-6885be995ff44ec7a4453a629d2963b9"><li>Should I click through: <b>Yes</b>; Voxtral is directly usable for climbing-app voice feedback</li></ul><ul class="notion-list notion-list-disc notion-block-a29046280fbe40feb7c4e037894e31f6"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://mistral.ai/news">mistral.ai/news</a></li></ul><div class="notion-text notion-block-7f8d6cf31b894b3999157add10b2fbdd"><b>Claude Opus 4.6 + Computer Use (Anthropic, 2026-02-05 / 03-23)</b></div><ul class="notion-list notion-list-disc notion-block-2f7535458618451990488e2c4440fcd7"><li>Event: Opus 4.6 reaches 80.8% on SWE-Bench Verified; Computer Use enters Pro/Max research preview</li></ul><ul class="notion-list notion-list-disc notion-block-70e28285e1684b2a8d536921ce4bd6e7"><li>Key content: 14.5-hour sustained task capability; Computer Use can click/type/navigate real applications on a Mac</li></ul><ul class="notion-list notion-list-disc notion-block-23fe590261da4dac807b0b08c4760d94"><li>Why it matters: a major expansion of agentic capability boundaries; coding-agent pipelines enter a new phase</li></ul><ul class="notion-list notion-list-disc notion-block-e394a71fbeec4c828d2199ce4c46e439"><li>Should I click through: awareness is enough; watch for when the Computer Use API goes GA</li></ul><ul class="notion-list notion-list-disc notion-block-dc1509d4b66b4f86826528c5d2f8d535"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.anthropic.com/news">Anthropic News</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://thenewstack.io/anthropic-march-2026-roundup/">The New Stack March roundup</a></li></ul><div class="notion-text notion-block-cceb0a62c69d42d9923e526b6c56b47a"><b>GPT-5.4 (OpenAI, 2026-03-05)</b></div><ul class="notion-list notion-list-disc notion-block-1212a1fbdb134dbda30b5ee9ee23715c"><li>Event: GPT-5.4 ships in Standard / Thinking / Pro tiers with a 1M context window; the first mainline reasoning model with integrated computer use</li></ul><ul class="notion-list notion-list-disc notion-block-0ca3b79d9f5044398cc5026987759b4f"><li>Key content: 75% on the OSWorld-V benchmark (real desktop productivity tasks); 33% fewer factual errors than GPT-5.2</li></ul><ul class="notion-list notion-list-disc notion-block-ddbc1316f045480fa5373eb595d7a303"><li>Why it matters: coding + agent + computer use in one; agentic infrastructure that engineering teams can actually deploy</li></ul><ul class="notion-list notion-list-disc notion-block-8d4d4bea88204314a70ae0f47afb3c12"><li>Should I click through: awareness is enough; wait for community feedback on practical performance</li></ul><ul class="notion-list notion-list-disc notion-block-b93fc32d0ffa4b65b9584ab51cea4f27"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://openai.com/index/introducing-gpt-5-4">openai.com/index/introducing-gpt-5-4</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://techcrunch.com/2026/03/05/openai-launches-gpt-5-4-with-pro-and-thinking-versions/">TechCrunch</a></li></ul><div class="notion-text notion-block-8a229e66e0c4491a9aeb1db45b948cc8"><b>MinerU2.5 (HuggingFace Papers, recent)</b></div><ul class="notion-list notion-list-disc notion-block-871dcd625cdb4384b9af635c13f486c3"><li>Event: a 1.2B-parameter vision-language model for document parsing with SOTA recognition accuracy</li></ul><ul class="notion-list notion-list-disc notion-block-9b4a0fc29a8b454d994ffda8122e3060"><li>Key content: specializes in parsing complex documents (tables, formulas, multi-column layouts); directly usable for document preprocessing in RAG pipelines</li></ul><ul class="notion-list notion-list-disc notion-block-582d20fe2be24d65b6fb9bfae9ee1056"><li>Why it matters: lightweight, open source, high parsing quality; lowers RAG data-preparation costs</li></ul><ul class="notion-list notion-list-disc notion-block-28999c8ef7bc48cf8b8dcb4b8b35e08c"><li>Should I click through: medium priority; dig in when working on a RAG project</li></ul><ul class="notion-list notion-list-disc notion-block-aa3ae0f92a5d42b29f273e0fe04a32da"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://huggingface.co/papers/trending">huggingface.co/papers/trending</a></li></ul><hr class="notion-hr notion-block-b109b7d85b0c4dd8b8a9f0ea8c431c44"/><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-5a0e0b165f044d2198d2e2d3bb066b82"
data-id="5a0e0b165f044d2198d2e2d3bb066b82"><span><div id="5a0e0b165f044d2198d2e2d3bb066b82" class="notion-header-anchor"></div><a class="notion-hash-link" href="#5a0e0b165f044d2198d2e2d3bb066b82" title="B. AI Engineering / Agents / Coding Workflow"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">B. AI Engineering / Agents / Coding Workflow</span></span></h3><div class="notion-text notion-block-a103999736e24ff1ba6d5ae07e902bda"><b>Memory Sparse Attention (MSA) (2026-03-26, arXiv)</b></div><ul class="notion-list notion-list-disc notion-block-a44d84cc55904bc49b2c04004d8aae44"><li>Content: a linear-complexity attention mechanism that lets LLMs handle ultra-long contexts (well beyond 1M tokens) efficiently, without quadratic memory</li></ul><ul class="notion-list notion-list-disc notion-block-c1b7974a2bdb412da735aafc5cbe4185"><li>Practical value: long-video analysis, very-long-document RAG, large-codebase understanding; lower inference costs</li></ul><ul class="notion-list notion-list-disc notion-block-4be5b5acc85c40f79139d1e36731939c"><li>What it means for my current work/study: key technical groundwork if I work on long video sequences; usable in interviews as "I understand how sparse attention is applied to long contexts"</li></ul><ul class="notion-list notion-list-disc notion-block-8aeeb5dab3a14ac2825700e5735cb09d"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/papers/date/2026-03-26">HuggingFace Papers - March 26</a></li></ul><div class="notion-text notion-block-9772f36cf7ec40f9aeb4a0874936144b"><b>Model Context Protocol (MCP) hits a 97M-install milestone</b></div><ul class="notion-list notion-list-disc notion-block-8501d1f4d4e64dbbaf6f06abbb4d7a23"><li>Content: MCP has passed 97 million installs; every major AI vendor now ships MCP-compatible tooling, moving it from "experimental" to "the agentic infrastructure standard"</li></ul><ul class="notion-list notion-list-disc notion-block-efc55f78f3974fc2b4e709fee098b085"><li>Practical value: agents should support MCP by default; the cost of plugging into the ecosystem's tools is minimal</li></ul><ul class="notion-list notion-list-disc notion-block-fa7f7c3e8d5a451687e40f99d97bae56"><li>What it means for my current work/study: mention "familiar with the MCP protocol" on resumes and in interviews; prefer MCP interfaces in side projects</li></ul><ul class="notion-list notion-list-disc notion-block-fcb38e403ae0423b8c175832a461b6d9"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.mean.ceo/ai-product-launches-news-march-2026/">AI Product Launches Blog</a></li></ul><div class="notion-text notion-block-51191d7df3e14562927d84ab4a48e1a8"><b>ARC-AGI-3 Benchmark (recent)</b></div><ul class="notion-list notion-list-disc notion-block-0f3585b0b01443b9a5fe0ee4a70a335c"><li>Content: a next-generation interactive benchmark for agentic intelligence; frontier systems score &lt;1% while humans score 100%</li></ul><ul class="notion-list notion-list-disc notion-block-0e964e16a8614ebbaabddc05a23dab8b"><li>Practical value: the latest standard for evaluating agentic capability; useful for gauging the limits of my own agent projects</li></ul><ul class="notion-list notion-list-disc notion-block-14ca2164985548b1aba9d3a68452d9b7"><li>What it means for my current work/study: a good interview topic; shows that today's agents remain a huge gap away from true AGI</li></ul><ul class="notion-list notion-list-disc notion-block-4c0e25e8313145e7aa602408ebcc7276"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/list/cs.AI/current">arXiv cs.AI current</a></li></ul><div class="notion-text notion-block-2cbfc21e6f894aeab595035c668c1a62"><b>OpenClaw (explosive GitHub growth)</b></div><ul class="notion-list notion-list-disc notion-block-fad83adb9bdc46c59005d7559af85574"><li>Content: a middleware agent between LLMs and the computer; executes shell, browser, and API tasks via "skills"; integrates WhatsApp/Telegram/Slack/Discord; 335K+ stars (overtaking React within 60 days)</li></ul><ul class="notion-list notion-list-disc notion-block-5aeed3fc672646aab2898d014836bd75"><li>Practical value: scaffolding for quickly building local agentic workflows without writing your own computer use layer</li></ul><ul
class="notion-list notion-list-disc notion-block-c05112e42f1d433e8b27dd8330e60a7c"><li>What it means for my current work/study: worth forking to study the architecture; agent projects can use it as a base component</li></ul><ul class="notion-list notion-list-disc notion-block-5719fa04e546429cb31971945230b728"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.kdnuggets.com/openclaw-explained-the-free-ai-agent-tool-going-viral-already-in-2026">KDnuggets - OpenClaw Explained</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://thenewstack.io/openclaw-github-stars-security/">The New Stack</a></li></ul><hr class="notion-hr notion-block-f791fa4ba2af4c41b8975e21fc530bf9"/><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-421cfe66bcd741bd9ab9b2f441586d75" data-id="421cfe66bcd741bd9ab9b2f441586d75"><span><div id="421cfe66bcd741bd9ab9b2f441586d75" class="notion-header-anchor"></div><a class="notion-hash-link" href="#421cfe66bcd741bd9ab9b2f441586d75" title="C. Vision / Video / Human Motion Analysis"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">C. Vision / Video / Human Motion Analysis</span></span></h3><div class="notion-text notion-block-9ca2a3405746411482bc5fd7bad0f72d"><b>VideoSeek: long-video agent (latest on arXiv)</b></div><ul class="notion-list notion-list-disc notion-block-1adc29dbc2654b0a940e78922c1d344f"><li>Content: a query-aware long-video exploration agent; think-act-observe loop + multi-granularity toolkit; sharply fewer processed frames with higher accuracy</li></ul><ul class="notion-list notion-list-disc notion-block-2b6d828e492846559de2e97272d7cd4d"><li>Relevance to a climbing-movement analysis app: <b>highly relevant</b>. Climbing videos typically run 3-15 minutes and require precise localization of key movement segments; the VideoSeek framework maps directly onto "upload video → locate key movement frames → analyze"</li></ul><ul class="notion-list notion-list-disc notion-block-68b4024ba03149aa8fa7e997bf3e1b02"><li>Transferable to the project: borrow its "video logic flow" design for automatic segmentation and temporal annotation of climbing-movement clips</li></ul><ul class="notion-list notion-list-disc notion-block-6bc401c7b2134c4abd92174c427b4a27"><li>Priority: <b>High</b></li></ul><ul class="notion-list notion-list-disc notion-block-6bd99e47996149ba9c9d513eac565abc"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://papers.cool/arxiv/2603.20185">papers.cool/arxiv/2603.20185</a></li></ul><div class="notion-text notion-block-7c5e3013c06f41debe0e7cf81368f7c1"><b>UniMotion: unified motion understanding and generation (latest on arXiv)</b></div><ul class="notion-list notion-list-disc notion-block-4d64bff9659540a6b738f2662f355c99"><li>Content: the first unified framework for understanding + generation across human motion / natural language / RGB images, via a Cross-Modal Aligned Motion VAE (CMA-VAE)</li></ul><ul class="notion-list notion-list-disc notion-block-46135024129a458d861062f237271ffb"><li>Relevance to the climbing app: <b>highly relevant</b>. Enables the full chain "video → motion understanding → language description → movement-improvement suggestions"</li></ul><ul class="notion-list notion-list-disc notion-block-ccade959432f4161af1ec28c355ff87c"><li>Transferable to the project: the motion-language alignment method; represent climbing movements as continuous motion tokens, align them with language, and generate improvement suggestions</li></ul><ul class="notion-list notion-list-disc notion-block-51d2c277ca5343fb9a5fed94ed1516c0"><li>Priority: <b>High</b></li></ul><ul class="notion-list notion-list-disc notion-block-efa9cfab250348a9b5a1c1b4e34437e9"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/list/cs.AI/current">arXiv cs.CV/cs.AI current</a></li></ul><div class="notion-text notion-block-952b0d396be24f48933bb69216457c72"><b>WildWorld: action-conditioned world-model dataset (2026-03-24, HuggingFace)</b></div><ul class="notion-list notion-list-disc notion-block-9b3304d1c29e48659c9f51c35c5c5858"><li>Content: a large-scale action-conditioned world-modeling dataset with explicit state annotations from photorealistic games, supporting physical-world action prediction</li></ul><ul class="notion-list notion-list-disc notion-block-194dcd059c5f49d7b5fb86940bdb31dd"><li>Relevance to the climbing app: <b>medium</b>. The dataset paradigm (explicit state annotation + action conditioning) is a methodological reference for building a climbing-movement dataset</li></ul><ul class="notion-list notion-list-disc notion-block-4db4fcf3ba3549309cef33eb26807891"><li>Transferable to the project: use the annotation paradigm as a reference when designing my own climbing-movement dataset schema</li></ul><ul class="notion-list notion-list-disc notion-block-1eecfac3c8584aa1bac8d1d5fcfb9597"><li>Priority: <b>Medium</b></li></ul><ul class="notion-list notion-list-disc notion-block-afbdf54d4eaa4d9687e585dafba44d6a"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/papers/trending">HuggingFace Papers</a></li></ul><div class="notion-text notion-block-4573d88e97ad47b088ee3ef326c95a87"><b>VideoDetective: long-video question answering (latest on arXiv)</b></div><ul class="notion-list notion-list-disc notion-block-769ef181aff14a3c9b849e042cc03fb9"><li>Content: a long-video QA framework combining query-to-segment relevance with cross-segment affinity; an effective "clue-seeking" mechanism</li></ul><ul class="notion-list notion-list-disc notion-block-dc8b9cb9cc804f8e923ad635f9dc985b"><li>Relevance to the climbing app: <b>medium</b>. Suits QA tasks like "when in this video does the athlete complete the flag move"</li></ul><ul class="notion-list notion-list-disc notion-block-0119059164a6410f90c87fa7ecbb1645"><li>Transferable to the project: the segment-relevance mechanism from video QA can power climbing-movement retrieval</li></ul><ul class="notion-list notion-list-disc notion-block-48ffa0683472456084e8936395e3bd3a"><li>Priority: <b>Medium</b></li></ul><ul class="notion-list notion-list-disc notion-block-c3d4cc1a034a4e4e8217ee749852ca93"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/list/cs.CV/recent">arXiv cs.CV recent</a></li></ul><div class="notion-text notion-block-cbb4cdf105384403ad2489c08b7cdc5c"><b>Gemini 3.1 Flash Live video-analysis capability</b></div><ul class="notion-list notion-list-disc notion-block-f5c05132237e4234818afac383664427"><li>Content: native video-stream input, real-time analysis, low cost at $0.25/M tokens</li></ul><ul class="notion-list notion-list-disc notion-block-7d7e5c73ac804af7a067c0477b62a97a"><li>Relevance to the climbing app: <b>very high</b>. Directly usable for an "upload video → real-time frame analysis → movement feedback" pipeline at a manageable cost</li></ul><ul class="notion-list notion-list-disc notion-block-70a1a821334f4607b91291adc7234f9c"><li>Transferable to the project: build an MVP on the Flash Live API to validate core-feature feasibility</li></ul><ul class="notion-list notion-list-disc notion-block-974eadeb6f8544f2a03f21666b44cbfd"><li>Priority: <b>High</b></li></ul><ul class="notion-list notion-list-disc notion-block-9f9f8966963c4b09b3d812f0d1b92e94"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://deepmind.google/models/model-cards/gemini-3-1-flash-live">deepmind.google/models/model-cards/gemini-3-1-flash-live</a></li></ul><div class="notion-text notion-block-e821075fc7f34da990bfcf66f0ede867"><b>Sports Action Spotting (arXiv survey area)</b></div><ul class="notion-list notion-list-disc notion-block-668561680f3e49ebb3d8d73908209d60"><li>Content: a survey of CNN/Transformer architectures for Temporal Action Localization (TAL), Action Spotting (AS), and Precise Event Spotting (PES), including real-time athlete tracking and pose estimation</li></ul><ul class="notion-list notion-list-disc notion-block-9817bfe47dfb476280c3d571e7d2c292"><li>Relevance to the climbing app: <b>medium</b>. Segmenting climbing movements is sports action spotting; the methods apply directly</li></ul><ul class="notion-list notion-list-disc notion-block-7980ed76a3804430915c45742f596a2e"><li>Transferable to the project: use Precise Event Spotting methods to localize "key movement moments" (e.g., the instant a specific move is completed)</li></ul><ul class="notion-list notion-list-disc
notion-block-730ea4827a5f4e3985cbbd1b406728b5"><li>Priority: <b>Medium</b></li></ul><ul class="notion-list notion-list-disc notion-block-d628f55998954b3faa6e4c2475feb3c2"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/html/2505.03991v1">arXiv action-spotting survey</a></li></ul><hr class="notion-hr notion-block-012f25974c2d42fcbefa799c94784aba"/><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-29fa8cf5d67644748e8f2c0054a168c4" data-id="29fa8cf5d67644748e8f2c0054a168c4"><span><div id="29fa8cf5d67644748e8f2c0054a168c4" class="notion-header-anchor"></div><a class="notion-hash-link" href="#29fa8cf5d67644748e8f2c0054a168c4" title="D. Productization / Commercialization / Industry News"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">D. Productization / Commercialization / Industry News</span></span></h3><div class="notion-text notion-block-7034959cedcd48be91bc739baa3ac2f3"><b>Trend confirmed: "the gap between frontier models is closing fast"</b></div><ul class="notion-list notion-list-disc notion-block-083ce0f8f2e04c4f81897654fa7f2d14"><li>News: GPT-5.4, Gemini 3.1 Pro, and Claude Opus 4.6 are tied for the top three on the Artificial Analysis index (around 57 points); differences on real tasks keep shrinking</li></ul><ul class="notion-list notion-list-disc notion-block-207e473b66354d23a8c874086445c7bc"><li>Underlying trend: the model itself is no longer a moat; product experience, toolchain integration, and vertical-scenario optimization are the real competitive edge</li></ul><ul class="notion-list notion-list-disc notion-block-e41c94afa17747168201e2badef2ca96"><li>Implications for side projects / job hunting / project direction: <b>building a vertical app (e.g., climbing analysis) beats competing on general-purpose models</b>; in interviews, say "I understand the model-commoditization trend, so I focus on differentiation at the application layer"</li></ul><ul class="notion-list notion-list-disc notion-block-521d4e954c4f4055a4a70f2e87e3924c"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://llm-stats.com/llm-updates">LLM Stats AI Updates</a></li></ul><div class="notion-text notion-block-e20168b5561a498da3f7e367c198c609"><b>Apple Siri × Gemini deep integration (iOS 26.4, 2026-03)</b></div><ul class="notion-list notion-list-disc notion-block-ead5ddb674f3464b9ca6ef23dd1663c6"><li>News: Siri calls a 1.2T-parameter Gemini via Private Cloud Compute, enabling cross-app awareness and on-screen understanding</li></ul><ul class="notion-list notion-list-disc notion-block-0a7f77d0fae340178dd8407a8a4ab9e4"><li>Underlying trend: AI assistants are evolving from "a chat box" into "an OS-level agent"; the on-device AI experience is being redefined</li></ul><ul class="notion-list notion-list-disc notion-block-e5093229c62c46fea2cf006f96352bfb"><li>Implications: the barrier to building AI features inside iOS apps keeps dropping; a climbing app could lean on Siri/Gemini capabilities for on-device analysis</li></ul><ul class="notion-list notion-list-disc notion-block-63a6f83fc91944f9aac4984a48574b15"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.mean.ceo/ai-product-launches-news-march-2026/">AI Product Launches Blog</a></li></ul><div class="notion-text notion-block-8da4e790d5cd46478759bb0d625336b9"><b>OpenAI raises ~$110 billion; global AI infrastructure expands</b></div><ul class="notion-list notion-list-disc
notion-block-a3a961ca1c5147a49f1c3d67c5ff8fc9"><li>News: OpenAI closed a funding round of roughly $110 billion (exact amount to be verified), earmarked for global AI-access infrastructure</li></ul><ul class="notion-list notion-list-disc notion-block-2c719fd8c75a43dc9e43f0f469498ff0"><li>Underlying trend: AI infrastructure investment is entering the "build superfactories" phase; inference costs will keep falling</li></ul><ul class="notion-list notion-list-disc notion-block-b99b32fd4bc74e5b96d8faa9a870aebe"><li>Implications: API costs will fall sharply through 2026-2027; feature designs that look too expensive today will likely become feasible, so don't cut them for cost reasons now</li></ul><ul class="notion-list notion-list-disc notion-block-99e5fd0ef28e4d9381b3bb2f7243eefb"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.digitalapplied.com/blog/march-2026-ai-roundup-month-that-changed-everything">Digital Applied - March 2026 Roundup</a></li></ul><div class="notion-text notion-block-6f095de86e48461e96d48da405a26005"><b>Mistral's Voxtral + Leanstral open-source strategy</b></div><ul class="notion-list notion-list-disc notion-block-e30afa14d1084069bb993300b81fefde"><li>News: Mistral shipped Voxtral (TTS) and Leanstral (a 6B Lean 4 formal-verification agent) in the same week, continuing to pressure the closed-source ecosystem with high-quality open source</li></ul><ul class="notion-list notion-list-disc notion-block-1798a647516846099dfd2e08cfff190a"><li>Underlying trend: open-source models have caught up with, and in places surpassed, closed source in specific verticals (speech, mathematical reasoning, code); open source is no longer a compromise</li></ul><ul class="notion-list notion-list-disc notion-block-8d8904d4cc304fca9d7ac2bdd0f0afe3"><li>Implications: an open-source-first stack is a cost-effective choice for a startup</li></ul><ul class="notion-list notion-list-disc notion-block-d03f47e3d7664f77aa7afed719c511eb"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://mistral.ai/news">mistral.ai/news</a></li></ul><hr class="notion-hr notion-block-fe597c54a6304091aa1e8359532c9e8f"/><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-6196f51c7a1f4ddda9694ae88674289f" data-id="6196f51c7a1f4ddda9694ae88674289f"><span><div id="6196f51c7a1f4ddda9694ae88674289f" class="notion-header-anchor"></div><a class="notion-hash-link" href="#6196f51c7a1f4ddda9694ae88674289f" title="E. Learning Value / Job-hunting Value"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">E. Learning Value / Job-hunting Value</span></span></h3><div class="notion-text notion-block-e4814a77ce474037964cf398902159c3"><b>The VideoSeek + UniMotion papers</b></div><ul class="notion-list notion-list-disc notion-block-f66f7bf330b64da3ae761d6b307fed01"><li>Content: a long-video agent framework + a unified motion understanding/generation framework</li></ul><ul class="notion-list notion-list-disc notion-block-58adf103f44d4b4893714b6704f0981a"><li>How I should use it: close reading + reproduction (VideoSeek first); in interviews, say "I understand the think-act-observe architecture of video agents and motion-language alignment methods"</li></ul><ul class="notion-list notion-list-disc notion-block-11c744cbd37c4c15b60e8b6f0da1db36"><li>Recommended action: bookmark both papers; read VideoSeek's framework section closely first; keep UniMotion's CMA-VAE structure as a mid-term upgrade reference for the climbing project</li></ul><ul class="notion-list notion-list-disc notion-block-212fddf9b7114a96aac7afcd3198e127"><li>Link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://papers.cool/arxiv/2603.20185">papers.cool/arxiv/2603.20185</a></li></ul><div class="notion-text notion-block-fac3214a984845fd8b62a09c53cc588e"><b>Memory Sparse Attention (linear-complexity long context)</b></div><ul class="notion-list notion-list-disc notion-block-8b8f8baec843416389896f0d4b914e39"><li>Content: linear-complexity attention supporting ultra-long sequences, aimed at LLM serving and long-video analysis</li></ul><ul class="notion-list notion-list-disc notion-block-50f73707d027472c8eb23f8f463156c7"><li>How I should use it: bookmark and read closely; in interviews, say "I understand the key tradeoffs of sparse-attention variants for long-context processing"</li></ul><ul class="notion-list notion-list-disc notion-block-6a664f77f3aa4077aa0f7f2221718277"><li>Recommended action: read it once carefully, understand how it differs from FlashAttention/Longformer, and write a technical note</li></ul><ul class="notion-list notion-list-disc
notion-block-90b713f08ac747dd9f8d3d4b753c4712"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/papers/date/2026-03-26">HuggingFace Papers March 26</a></li></ul><div class="notion-text notion-block-584f0ec9651c48b7b218c9c3b6aa7627"><b>Gemini 3.1 Flash Live API 实践</b></div><ul class="notion-list notion-list-disc notion-block-2fa50cd8bc664ccb8b89f18809dc4645"><li>内容：低成本实时多模态 API，直接可用于视频流分析</li></ul><ul class="notion-list notion-list-disc notion-block-a04cfb42833c4a83a1fac400f136f9fc"><li>适合我怎么用：立刻试用；做一个小 demo（上传攀岩短视频 → 调用 API → 输出动作描述）放进 portfolio</li></ul><ul class="notion-list notion-list-disc notion-block-16d7fe5d6a6f41329e4a99156a35c78a"><li>推荐动作：今天注册 API key，跑通官方 quickstart；这个 demo 可以直接写进简历「Built video analysis pipeline using Gemini 3.1 Flash Live」</li></ul><ul class="notion-list notion-list-disc notion-block-1c01d2a774a34f79b85e94f068410cbb"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://deepmind.google/models/model-cards/gemini-3-1-flash-live">deepmind.google/models/model-cards/gemini-3-1-flash-live</a></li></ul><div class="notion-text notion-block-35fa09ff23f14ff4aab8b3a207d82a44"><b>MCP 协议深度理解</b></div><ul class="notion-list notion-list-disc notion-block-c59c095756ea4125ba243718246bc78c"><li>内容：97M 安装量、所有主流厂商支持，已成 agentic infra 标准</li></ul><ul class="notion-list notion-list-disc notion-block-998db20f79e249928ec5d5f9382e2004"><li>适合我怎么用：面试表达「我熟悉 MCP 协议设计，理解 agentic 系统中 tool use 的标准化趋势」</li></ul><ul class="notion-list notion-list-disc notion-block-10afdb0ae50a41f28e5ad4f584ae4f02"><li>推荐动作：看一遍 MCP 官方文档；在一个 side project 中实现 MCP server 接口</li></ul><ul class="notion-list notion-list-disc notion-block-e85338777871457b829ca4d11804b96f"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://modelcontextprotocol.io">modelcontextprotocol.io</a></li></ul><hr class="notion-hr notion-block-4eb03b9de21e4c159dfccb48898a107d"/><h2 class="notion-h 
notion-h1 notion-h-indent-0 notion-block-505923344d9b4fe19044cbabf5c1dfc0" data-id="505923344d9b4fe19044cbabf5c1dfc0"><span><div id="505923344d9b4fe19044cbabf5c1dfc0" class="notion-header-anchor"></div><a class="notion-hash-link" href="#505923344d9b4fe19044cbabf5c1dfc0" title="三、今日高分 GitHub Repo"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">三、今日高分 GitHub Repo</span></span></h2><div class="notion-text notion-block-d27fa2ca08544961992586c0ab289cea"><b>① OpenClaw</b></div><ul class="notion-list notion-list-disc notion-block-ece38efd687b465ab72ada76a567a89a"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://github.com/pspdfkit/openclaw">github.com/pspdfkit/openclaw</a> （⚠️ 待验证官方 repo URL）</li></ul><ul class="notion-list notion-list-disc notion-block-528265e0109043878f980684dc7502d7"><li>方向标签：agent / app / infra</li></ul><ul class="notion-list notion-list-disc notion-block-9b1c9cd5eab84843a88de845ec7bbbd9"><li>这项目是干什么的：LLM 与计算机之间的通用 agent 中间层，通过「skills」系统执行 shell/浏览器/API 任务，集成主流 IM 平台</li></ul><ul class="notion-list notion-list-disc notion-block-02cf61ff666843ec8d43767473502699"><li>为什么今天值得关注：60 天内从 0 → 335K+ stars，超越 React 成 GitHub 最多 star 项目，社区讨论度极高</li></ul><ul class="notion-list notion-list-disc notion-block-5556f55ab4124f839c29ffdf6becba47"><li>与我的相关性：agent 架构参考；可基于此快速搭建 coding agent 或任务自动化 workflow</li></ul><ul class="notion-list notion-list-disc notion-block-c66919e7b74e462d9405159345b1aad1"><li>上手成本：中</li></ul><ul class="notion-list notion-list-disc 
notion-block-9f684700b0834dec90a528dd9cff3951"><li>是否建议收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-eb58cdd7193843cb814e1a55e40f0513"><li>是否建议复现：是（fork 后做一个小 task automation demo）</li></ul><ul class="notion-list notion-list-disc notion-block-1af0bb7d433b4afb955c264cd04cb39e"><li>一句话判断：2026 年 agent 工具链领域最值得关注的开源项目，架构值得认真研读</li></ul><div class="notion-text notion-block-d9a9cbc9771a4a60a83b746fc694947a"><b>② VideoSeek（待 GitHub 公开）</b></div><ul class="notion-list notion-list-disc notion-block-763a5007827449249b8cb4e6a681c11f"><li>GitHub 链接：（待论文作者公开，可跟踪 arXiv 主页）</li></ul><ul class="notion-list notion-list-disc notion-block-27b628194f9d4ded8a300c4bc2b77e80"><li>方向标签：video / agent / multimodal</li></ul><ul class="notion-list notion-list-disc notion-block-edaf2e821b4e4471ab53781ac9626727"><li>这项目是干什么的：长视频理解 agent，query-aware 视频探索框架</li></ul><ul class="notion-list notion-list-disc notion-block-c1f30d6b6a4b4fb1aba200534d1ee06b"><li>为什么今天值得关注：arXiv 新鲜出炉，与攀岩 app 需求高度契合</li></ul><ul class="notion-list notion-list-disc notion-block-d35ed640a3cc439aa26b91cbc2cf2e6c"><li>与我的相关性：极高，直接服务攀岩视频分析核心功能</li></ul><ul class="notion-list notion-list-disc notion-block-a1a88139451b49f3adddfa20a6ee3d20"><li>上手成本：中</li></ul><ul class="notion-list notion-list-disc notion-block-69042d120ca1471fb08a1b426e5d9658"><li>是否建议收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-202342828a2b46a0a632fbebdce81a61"><li>是否建议复现：是（等代码公开后第一时间跑通）</li></ul><ul class="notion-list notion-list-disc notion-block-e40c1282f7c441e8ba624f27eb9d103b"><li>一句话判断：视频 agent 方向必看论文，代码一旦公开立刻复现</li></ul><div class="notion-text notion-block-1a32300bfc79443ebf1bab21ebeb35cb"><b>③ UniMotion（待 GitHub 公开）</b></div><ul class="notion-list notion-list-disc notion-block-c61b4da73cc94fc7b3e53dc4c71c5b60"><li>GitHub 链接：（跟踪 arXiv 主页）</li></ul><ul class="notion-list notion-list-disc notion-block-b7184cbc87ff475285a31e60067307a7"><li>方向标签：video / motion / multimodal / training</li></ul><ul class="notion-list 
notion-list-disc notion-block-f2e462d649c84f6e8c5c9fcfa915aff3"><li>这项目是干什么的：统一人体运动理解与生成框架，motion-language-RGB 三模态对齐</li></ul><ul class="notion-list notion-list-disc notion-block-508eca5d77384450843c5065f1935e0d"><li>为什么今天值得关注：攀岩动作分析 app 的理想技术底座之一</li></ul><ul class="notion-list notion-list-disc notion-block-92773cc3bcfb4e278c0fd686795f282c"><li>与我的相关性：极高，motion → language → feedback 链路完整</li></ul><ul class="notion-list notion-list-disc notion-block-a04d4e9c133d4bc0925ccb5f8529796b"><li>上手成本：高</li></ul><ul class="notion-list notion-list-disc notion-block-f8657382714c40ecb3d3961298afc4e6"><li>是否建议收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-701f4ce632bf47a8940edbe7291554f1"><li>是否建议复现：中期计划（先理解框架，数据不多时考虑 fine-tune）</li></ul><ul class="notion-list notion-list-disc notion-block-b34ceecb5676409a9f75944a962b5fd3"><li>一句话判断：motion AI 方向的重要论文，列入项目 roadmap</li></ul><div class="notion-text notion-block-beec5067171e4d8197c95f56d09ebeba"><b>④ LangChain（里程碑：100K stars）</b></div><ul class="notion-list notion-list-disc notion-block-b6862d438c2147f380fae44386911872"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://github.com/langchain-ai/langchain">github.com/langchain-ai/langchain</a></li></ul><ul class="notion-list notion-list-disc notion-block-22a690829e194ac39add586a47009ab4"><li>方向标签：agent / infra / RAG / dev tools</li></ul><ul class="notion-list notion-list-disc notion-block-3b964ea8bad848fab03647c598f048b5"><li>这项目是干什么的：LLM 应用开发框架，RAG/Agent/Chain 工具链标准库</li></ul><ul class="notion-list notion-list-disc notion-block-7e471dde491445009a92c192c3ae539e"><li>为什么今天值得关注：突破 100K stars，GitHub 历史增速最快 dev tools 之一</li></ul><ul class="notion-list notion-list-disc notion-block-2447e0b53335451c8352b2f210d5640a"><li>与我的相关性：高，agent 项目开发的基础工具</li></ul><ul class="notion-list notion-list-disc notion-block-9278dea9e48a49a99a570148db3ddc92"><li>上手成本：低</li></ul><ul class="notion-list notion-list-disc 
notion-block-0df009819c11430f9e2321bebf2a54bb"><li>是否建议收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-f9520959d1394a3398d7a5df304c93b6"><li>是否建议复现：已很成熟，重点是跟上新 feature（v0.3+ 的 LangGraph）</li></ul><ul class="notion-list notion-list-disc notion-block-a9db58c29a7445bab89257717b00bd6d"><li>一句话判断：agent 工程必备，重点关注 LangGraph 的状态机 agent 设计</li></ul><div class="notion-text notion-block-766b59e9d5384e38a4fdb0f5422a8825"><b>⑤ MinerU2.5（HuggingFace）</b></div><ul class="notion-list notion-list-disc notion-block-d7529dfbf20d418d92d1a25406b1a8b9"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://github.com/opendatalab/MinerU">github.com/opendatalab/MinerU</a></li></ul><ul class="notion-list notion-list-disc notion-block-5fb975f07f074d458c562e09a8ac42be"><li>方向标签：RAG / infra / deployment</li></ul><ul class="notion-list notion-list-disc notion-block-a4030cbce96d479e980505f3cf9ad5b1"><li>这项目是干什么的：1.2B 参数文档解析 VLM，支持表格/公式/多栏版面结构化提取</li></ul><ul class="notion-list notion-list-disc notion-block-ae5bbc4662fb45eb9c05a5e6c4ecd8bf"><li>为什么今天值得关注：RAG pipeline 中文档预处理的 SOTA 开源方案，HuggingFace 上近日热门</li></ul><ul class="notion-list notion-list-disc notion-block-ac35ca99d34f4825b8551428109687a1"><li>与我的相关性：中（做 RAG 类项目时直接用）</li></ul><ul class="notion-list notion-list-disc notion-block-63bbf8e98047411dbebf85002b5aa388"><li>上手成本：低</li></ul><ul class="notion-list notion-list-disc notion-block-7f814281499743aeb03402da1d301bdf"><li>是否建议收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-46a1450699c24106bc3b66fd686c7520"><li>是否建议复现：可以，文档完整，pip install 即用</li></ul><ul class="notion-list notion-list-disc notion-block-b0a9594a2b2c43a4aff6cf7b5e96c055"><li>一句话判断：RAG 项目文档处理的最佳开源选择，收藏备用</li></ul><div class="notion-text notion-block-8a95f7ab055b40589a47e6673d8205a7"><b>⑥ awesome-ai-agents-2026</b></div><ul class="notion-list notion-list-disc notion-block-7103e12e167b4081953ff93745956798"><li>GitHub 链接：<a target="_blank" rel="noopener 
noreferrer" class="notion-link" href="http://github.com/caramaschiHG/awesome-ai-agents-2026">github.com/caramaschiHG/awesome-ai-agents-2026</a></li></ul><ul class="notion-list notion-list-disc notion-block-96f83a1ec22d48a0a2cf4860896bb5cf"><li>方向标签：agent / app</li></ul><ul class="notion-list notion-list-disc notion-block-b745901368bc4792a1236117c1fcdd62"><li>这项目是干什么的：300+ agent 相关资源合集，20+ 分类，每月更新</li></ul><ul class="notion-list notion-list-disc notion-block-be27df83559749728cf1e09286029064"><li>为什么今天值得关注：快速浏览 agent 生态全貌的最高效方式</li></ul><ul class="notion-list notion-list-disc notion-block-81f7f50c9c6c430ea7795d73ea6f2837"><li>与我的相关性：中，帮助快速找到 agent 领域值得参考的项目</li></ul><ul class="notion-list notion-list-disc notion-block-2d6c7710aa104e9f84aed25930cce42f"><li>上手成本：低（纯浏览）</li></ul><ul class="notion-list notion-list-disc notion-block-5da0c4d014a9443bacd46be8c5341a1a"><li>是否建议收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-f825abaf89914dfcb0e3ba3fdc9e7d91"><li>是否建议复现：否（是资源合集）</li></ul><ul class="notion-list notion-list-disc notion-block-242826c8e0404b5fa72c45a43dc195e7"><li>一句话判断：agent 选型前必逛一次</li></ul><hr class="notion-hr notion-block-35cb112af4344ed6bf75b67010e2cd93"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-d4a32425b7724de8848bf5e302e06e0b" data-id="d4a32425b7724de8848bf5e302e06e0b"><span><div id="d4a32425b7724de8848bf5e302e06e0b" class="notion-header-anchor"></div><a class="notion-hash-link" href="#d4a32425b7724de8848bf5e302e06e0b" title="四、今日最值得看的 3 个链接"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">四、今日最值得看的 
3 个链接</span></span></h2><div class="notion-text notion-block-52bced9198d0462592abf8da79b86780"><b>① VideoSeek 论文</b></div><div class="notion-text notion-block-603fd57147f1450697a241a7d9687d4a">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://papers.cool/arxiv/2603.20185">papers.cool/arxiv/2603.20185</a></div><div class="notion-text notion-block-ff58c190b68843eea94cd982f7488f5c">为什么今天最值得点开：直接解决攀岩 app 最核心的技术问题「如何在长视频中高效定位关键动作」，框架清晰可复现，今天就应该读完 abstract + method</div><div class="notion-text notion-block-046281284dc84d28a62b2a68fe8778df"><b>② Gemini 3.1 Flash Live Model Card + Quickstart</b></div><div class="notion-text notion-block-676aece5a68b4af59ce2723dbdc276d1">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://deepmind.google/models/model-cards/gemini-3-1-flash-live">deepmind.google/models/model-cards/gemini-3-1-flash-live</a></div><div class="notion-text notion-block-270120ab69494d5492e4f4a6efa37f11">为什么今天最值得点开：你的攀岩分析 app MVP 的 API 方案就在这里，今天可以跑通第一个视频分析 demo，portfolio 立刻有新内容</div><div class="notion-text notion-block-06769eddfc064332a22a2dc0bcb39de6"><b>③ Mistral Voxtral TTS 发布页</b></div><div class="notion-text notion-block-10900740e7174f538054dbe4ca957c0d">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://mistral.ai/news">mistral.ai/news</a></div><div class="notion-text notion-block-312f3d02fbd44df8b89101d71f02af6d">为什么今天最值得点开：轻量开源 TTS，攀岩 app 的「语音反馈」功能可以直接基于此构建，边缘部署可行，Apache 2.0 无商用顾虑</div><hr class="notion-hr notion-block-eb5eccf0fea3421a93abaaa81d1f8377"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-0a4b4d0796cd4fbfab38f23139df080f" data-id="0a4b4d0796cd4fbfab38f23139df080f"><span><div id="0a4b4d0796cd4fbfab38f23139df080f" class="notion-header-anchor"></div><a class="notion-hash-link" href="#0a4b4d0796cd4fbfab38f23139df080f" title="五、今日行动清单"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 
1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">五、今日行动清单</span></span></h2><div class="notion-text notion-block-93087132537545b48bc3b2014e95d345"><b>1. 今天值得收藏但不必立刻看的</b></div><ul class="notion-list notion-list-disc notion-block-25364985a99d49589b8902232fc5874e"><li>awesome-ai-agents-2026 GitHub 合集</li></ul><ul class="notion-list notion-list-disc notion-block-e2108c5aadf8432bbeca822b5549cf40"><li>LangGraph 最新文档（重点：状态机 agent 设计）</li></ul><ul class="notion-list notion-list-disc notion-block-4202c91c9f9b476e9e65f1b8052e8f75"><li>WildWorld 数据集页面（攀岩数据集设计参考）</li></ul><ul class="notion-list notion-list-disc notion-block-1f738dd06f0a4055a9184ac41e177bf1"><li>VideoDetective 论文（等 VideoSeek 消化完再看）</li></ul><div class="notion-text notion-block-6b6ffc346f6248ad9b04e18f54686485"><b>2. 今天值得精读的</b></div><ul class="notion-list notion-list-disc notion-block-c059f05ef05c408ebc9b06177f3a6940"><li>VideoSeek 论文（重点：framework 设计 + experiment 部分）</li></ul><ul class="notion-list notion-list-disc notion-block-c6e97c31cecd45c5a4e65a7ae299c432"><li>Memory Sparse Attention 论文（理解线性复杂度 attention 的 tradeoff）</li></ul><div class="notion-text notion-block-ffe65fa42ebb43b5b0fed041aecea31a"><b>3. 
今天值得复现/试用的</b></div><ul class="notion-list notion-list-disc notion-block-4c3405ffa10a4b47926dfdaab3f06618"><li><b>立刻做</b>：Gemini 3.1 Flash Live API quickstart → 上传一段攀岩视频 → 看输出质量</li></ul><ul class="notion-list notion-list-disc notion-block-343d5debebbe459eae2ca212a4c2d5dc"><li><b>本周做</b>：MinerU2.5 pip install，测试文档解析效果</li></ul><ul class="notion-list notion-list-disc notion-block-0e9ec12ad3eb42afbf23f70812abad13"><li><b>等代码公开后</b>：VideoSeek 复现</li></ul><div class="notion-text notion-block-6d4f888061854b10bddbb8bcf748b6da"><b>4. 今天值得记到项目 roadmap 的</b></div><ul class="notion-list notion-list-disc notion-block-ca6ef17b1fe04fd0a0bc83cc6a823336"><li>攀岩 app 视频分析 backbone：Gemini 3.1 Flash Live（短期 MVP）→ UniMotion fine-tune（中期升级）</li></ul><ul class="notion-list notion-list-disc notion-block-99b41c29654c4609a734e813c1ee9ba1"><li>长视频定位模块：参考 VideoSeek think-act-observe 框架</li></ul><ul class="notion-list notion-list-disc notion-block-f920747b2d8d4558ac2c87d042097543"><li>语音反馈模块：Mistral Voxtral TTS（边缘部署方案）</li></ul><ul class="notion-list notion-list-disc notion-block-747c29c1b7fd47bea33fbdd2a9b360d9"><li>Agent 工具层：研究 OpenClaw 架构，考虑用于任务编排</li></ul><div class="notion-text notion-block-56e292ef85ef473582387ec855f9b69b"><b>5. 
今天面试里可以拿来讲的 1~2 个点</b></div><ul class="notion-list notion-list-disc notion-block-1ea28fb840f7433ebbb9734b64346e93"><li><b>点 1（技术深度）</b>：「我在研究 VideoSeek 提出的视频 agent 框架，它用 think-act-observe 循环 + 多粒度 toolkit 解决长视频中的 query-aware 片段定位问题，我正在将这个框架应用到我的攀岩动作分析项目中。」</li></ul><ul class="notion-list notion-list-disc notion-block-23b1ea1601214c21b941a687bb19f03d"><li><b>点 2（行业判断）</b>：「2026 年初的一个核心趋势是前沿模型能力快速趋同，真正的差异化在 application layer。以 Gemini 3.1 Flash Live 为例，低成本实时视频分析 API 的出现使得之前不可行的垂直视频应用变得商业可行，这正是我做攀岩分析 app 的时机判断依据。」</li></ul><hr class="notion-hr notion-block-59cf2dc43b7c48eca2c3e6290edf8870"/><div class="notion-text notion-block-65a98ed582c14eb29e81f49cfd53af7f"><em>📌 本日报由 AI 自动生成 | 2026-03-29 | 信息来源：官方博客、arXiv、HuggingFace Papers、GitHub Trending</em></div></main></div>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[AI 日报 | 2026-03-27]]></title>
            <link>https://dundun0504.com/article/ai-daily-2026-03-27</link>
            <guid>https://dundun0504.com/article/ai-daily-2026-03-27</guid>
            <pubDate>Fri, 27 Mar 2026 00:00:00 GMT</pubDate>
            <description><![CDATA[2026-03-27 AI 日报：GPT-5.4 vs Claude Opus 4.6 编程能力深度拆解、DeepSeek V4 万亿参数开源冲击、NVIDIA Nemotron 3 Super 最强开源推理模型、LTX-2.3 开源 4K 视频生成、Cursor 并行子 Agent 正式落地、Mobile-VideoGPT 轻量边缘视频理解。]]></description>
            <content:encoded><![CDATA[<div id="notion-article" class="mx-auto overflow-hidden "><main class="notion light-mode notion-page notion-block-330670e5549981b48358d7de6d9a2407"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-be8577fe241a401d8784a675a0cef1c6" data-id="be8577fe241a401d8784a675a0cef1c6"><span><div id="be8577fe241a401d8784a675a0cef1c6" class="notion-header-anchor"></div><a class="notion-hash-link" href="#be8577fe241a401d8784a675a0cef1c6" title="一、今日最重要的 5 条"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">一、今日最重要的 5 条</span></span></h2><div class="notion-text notion-block-7b4a6d10a7ab4d7987988c368c6f80ae"><b>① GPT-5.4 vs Claude Opus 4.6：编程能力全面拆解（2026-03-05 / 02-05）</b></div><div class="notion-text notion-block-8100ea4b1eec4be4b4310054b2d9741c">OpenAI GPT-5.4（3 月 5 日）和 Anthropic Claude Opus 4.6（2 月 5 日）确立了本轮「顶级模型」的双雄格局。SWE-Bench Verified：Claude 80.8% vs GPT-5.4 77.2%，Claude 在标准编程任务上占优；但在 SWE-Bench Pro（抗数据污染版）上，GPT-5.4 以 57.7% 反超 Claude 的约 45.9%。两者的 1M token context 窗口已成新标准，AI 工程/coding agent 选型必须基于具体 use case 而非单一 benchmark。</div><div class="notion-text notion-block-c95fa6e94e4c408898818a71c21878cd">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.datacamp.com/blog/gpt-5-4-vs-claude-opus-4-6">DataCamp 深度对比</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://evolink.ai">evolink.ai</a><a target="_blank" rel="noopener noreferrer" class="notion-link" 
href="https://evolink.ai/blog/swe-bench-verified-2026-claude-vs-gpt"> SWE-Bench 解读</a></div><hr class="notion-hr notion-block-812f9809d96640f7964a481e05429e97"/><div class="notion-text notion-block-55b187c14545478e9ae926fdcc998969"><b>② DeepSeek V4 开源万亿参数多模态大模型（2026-03-03）</b></div><div class="notion-text notion-block-63c398a928144a9ca02ac0747600e480">DeepSeek V4 于 3 月 3 日发布，MODEL1 架构：约 1 万亿总参数，每 token 激活约 37B，原生支持文本/图像/视频/音频，context 超 100 万 token。KV cache 分层优化带来 40% 内存节省 + 1.8x 推断加速，专门针对华为昇腾芯片优化，证明中国生态可独立训练前沿模型。开源协议可商用。</div><div class="notion-text notion-block-e8bbac6ece0545f1978459f469a252cb">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://qverlabs.com/blog/deepseek-v4-trillion-parameter-multimodal-ai">详细分析</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.nxcode.io/resources/news/deepseek-v4-release-specs-benchmarks-2026">NxCode 规格解读</a></div><hr class="notion-hr notion-block-f5a1c16a9eb24e6bb59ed310c62c8b20"/><div class="notion-text notion-block-fa18ddff52824d4b81a9071cdaa9621f"><b>③ NVIDIA Nemotron 3 Super：开源最强 coding 模型（2026-03-11, GTC）</b></div><div class="notion-text notion-block-b51297af901f4bdca3b55aa0b150fb8c">GTC 2026 发布，120B 总参数 / 12B 激活参数，Mamba-2 + MoE 混合架构。SWE-Bench Verified 60.47%，开源权重排名第一，比 Qwen3.5-122B 推断吞吐高 7.5x，478 tokens/sec。完全开源（权重 + 数据集 + 训练方案），已上线 Hugging Face / OpenRouter（免费试用）。</div><div class="notion-text notion-block-ae3d8034e0994183928c031cd4c9a42a">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blogs.nvidia.com/blog/nemotron-3-super-agentic-ai/">NVIDIA 官方博客</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://openrouter.ai/nvidia/nemotron-3-super-120b-a12b:free">OpenRouter 免费入口</a></div><hr class="notion-hr notion-block-de6edaa103b34509be93bca0b4ba6e0b"/><div class="notion-text notion-block-0fb52390e3e0454ab366db6ae0d5f37a"><b>④ LTX-2.3：开源 4K@50FPS 视频生成，首次原生同步音频（2026-03-05）</b></div><div 
class="notion-text notion-block-6c3c7699404c4691a2edc62e51aa62a3">Lightricks 发布 LTX-2.3，22B 参数，Apache 2.0 商用许可。首个开源模型单 pass 同时生成视频帧 + 同步音频；原生竖版视频支持；单次生成 20 秒 4K 片段。可本地运行于消费级硬件。对攀岩 app：可生成「标准动作」示范对比视频，解决标注数据稀缺问题。</div><div class="notion-text notion-block-225b8cea8a8c4d19bc5528880cf64f1e">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.digitalapplied.com/blog/ltx-2-3-open-source-ai-video-generation-synchronized-audio">发布概述</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/Lightricks/LTX-Video">Hugging Face</a></div><hr class="notion-hr notion-block-126413b775cd43b7acb19ceab251a51f"/><div class="notion-text notion-block-9d9305fb7d1f4cd9813d0505f137c00f"><b>⑤ Cursor 并行子 Agent + BugBot 进入生产（2026-02-末~03）</b></div><div class="notion-text notion-block-d6ac1e6b584e416e9b124781f50bf84a">最多 8 个独立 cloud VM 并行执行，Git worktree 隔离，30 秒内完成多数任务。BugBot 从「发现问题」升级为「自动修复」——检测 PR bug 后自动启动 cloud agent 修复并提交，35% 的 Autofix 建议被直接 merge。这是 agentic coding 从 demo 进入生产的最清晰信号。</div><div class="notion-text notion-block-4c8194fa863346ecbaf1bcaf10e7960a">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://aitoolanalysis.com/cursor-ai-review/">Cursor AI Review 2026</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://cursor.com/changelog/page/2">更新日志</a></div><hr class="notion-hr notion-block-718ed8c6a7dc48648841f7fa5b219d97"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-69924e59c2634fb1aef964fb45eeba28" data-id="69924e59c2634fb1aef964fb45eeba28"><span><div id="69924e59c2634fb1aef964fb45eeba28" class="notion-header-anchor"></div><a class="notion-hash-link" href="#69924e59c2634fb1aef964fb45eeba28" title="二、按我的目标分类"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 
00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">二、按我的目标分类</span></span></h2><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-5fc901d6ca1f498d9b3567b8fb6bbcdb" data-id="5fc901d6ca1f498d9b3567b8fb6bbcdb"><span><div id="5fc901d6ca1f498d9b3567b8fb6bbcdb" class="notion-header-anchor"></div><a class="notion-hash-link" href="#5fc901d6ca1f498d9b3567b8fb6bbcdb" title="A. 前沿模型 / 一手发布"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">A. 
前沿模型 / 一手发布</span></span></h3><div class="notion-text notion-block-66447e502b404be89bb0d6bcc7d4d814"><b>GPT-5.4「Thinking」</b></div><ul class="notion-list notion-list-disc notion-block-7e87b2f15b5d420a811eba775f6db640"><li>事件：OpenAI 3 月 5 日发布，含「Thinking」变体，内部定位为 GPT-6 级推理能力的紧凑版</li></ul><ul class="notion-list notion-list-disc notion-block-3058917dd2da44799aa015b5654f5bd1"><li>核心内容：1M context，128K max output，SWE-Bench Verified 77.2%，已通过 OpenRouter 和 OpenAI API 开放</li></ul><ul class="notion-list notion-list-disc notion-block-c1d412d08dab4270b214a02f155d42b8"><li>为什么重要：「小模型达到更大模型推理水平」是 2026 年核心架构趋势，对推断成本有直接影响</li></ul><ul class="notion-list notion-list-disc notion-block-9d05b13256ce4d109dd8167389edea3b"><li>我需不需要点开：需要，重点测试 Thinking 模式的 coding 能力</li></ul><ul class="notion-list notion-list-disc notion-block-96a81b17cf34402db118333fd5bc4e42"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://portkey.ai/blog/gpt-5-4-vs-claude-opus-4-6/">Portkey GPT-5.4 vs Claude 对比</a></li></ul><div class="notion-text notion-block-3c39866dd93f4b12b3e86fbe8e69286a"><b>Claude Opus 4.6</b></div><ul class="notion-list notion-list-disc notion-block-33fa876e617d4b2298e7665023dee13b"><li>事件：Anthropic 2 月 5 日发布，SWE-Bench Verified 80.8%，当前商业模型编程最高分</li></ul><ul class="notion-list notion-list-disc notion-block-b502cb10663c43ff8887b69a745ae9c7"><li>核心内容：1M context，扩展思维模式支持复杂多步推理</li></ul><ul class="notion-list notion-list-disc notion-block-2eac202e437743b38c1ab64c463d4d97"><li>为什么重要：这是目前给 coding agent 任务选模型的首选依据</li></ul><ul class="notion-list notion-list-disc notion-block-801b0974575f464287f30f3346407c60"><li>我需不需要点开：是，直接上手测试攀岩 app 的代码生成质量</li></ul><ul class="notion-list notion-list-disc notion-block-34d62c6b74e44e1cb3e56cfbf70934e3"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.mindstudio.ai/blog/gpt-54-vs-claude-opus-46-vs-gemini-31-pro-benchmarks">MindStudio benchmark 对比</a></li></ul><div class="notion-text 
notion-block-9a06382dbd914689b1965d53dadb1c48"><b>Gemini 3.1 Flash-Lite</b></div><ul class="notion-list notion-list-disc notion-block-8510603e60dc470a9e1ccfc4e5042559"><li>事件：Google DeepMind，2026 年 2-3 月</li></ul><ul class="notion-list notion-list-disc notion-block-4266fd12667a440dadc41351619de983"><li>核心内容：2.5x 更快，$0.25/M tokens，多模态全覆盖</li></ul><ul class="notion-list notion-list-disc notion-block-67c8ed1ca4ad4c809fa411355426ace6"><li>为什么重要：视频理解 API 的高性价比选项，攀岩 app 的模型选型候选</li></ul><ul class="notion-list notion-list-disc notion-block-2f5477f5c1a045da857a4464a3bdd268"><li>我需不需要点开：中等，关注视频 token 限制和单次处理时长上限</li></ul><ul class="notion-list notion-list-disc notion-block-4708b6868d804077a5c1da840e8486b4"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://llm-stats.com/ai-news">LLM Stats 3 月汇总</a></li></ul><div class="notion-text notion-block-e0f06222f4324f66a5e38634d861bcca"><b>NVIDIA Nemotron 3 Super 120B</b></div><ul class="notion-list notion-list-disc notion-block-bb9f992376c94b3ea8771e63949ad911"><li>事件：GTC 2026，3 月 11 日</li></ul><ul class="notion-list notion-list-disc notion-block-20681b80073c45c4b53e298eeacaf441"><li>核心内容：开源权重，Mamba-2 + MoE 混合架构，SWE-Bench 60.47%，7.5x 推断吞吐优势</li></ul><ul class="notion-list notion-list-disc notion-block-5a8fff4ae99f411a8a040dc09c4f7ed6"><li>为什么重要：开源最强 coding 模型，可私有部署，适合对数据隐私有要求的项目</li></ul><ul class="notion-list notion-list-disc notion-block-54a60d8e3ae742dd8d78ec7816259501"><li>我需不需要点开：是，尤其关注 OpenRouter 免费试用入口</li></ul><ul class="notion-list notion-list-disc notion-block-794f30cc2ba94213b65411665e36f596"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/nvidia/NVIDIA-Nemotron-3-Super-120B-A12B-BF16">HuggingFace 模型页</a></li></ul><hr class="notion-hr notion-block-0e33d34b4f1f479591e73b938b967bc6"/><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-54c80df78c4b4199a10c237bd5b731d1" data-id="54c80df78c4b4199a10c237bd5b731d1"><span><div 
id="54c80df78c4b4199a10c237bd5b731d1" class="notion-header-anchor"></div><a class="notion-hash-link" href="#54c80df78c4b4199a10c237bd5b731d1" title="B. AI 工程 / Agent / Coding Workflow"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">B. AI 工程 / Agent / Coding Workflow</span></span></h3><div class="notion-text notion-block-1e64ab9328cc4f0ea553c1d66273621c"><b>Cursor 并行子 Agent（2026-03）</b></div><ul class="notion-list notion-list-disc notion-block-96c9a65bc0ab45d1ad47ad58f7343d3d"><li>内容：最多 8 个并行 cloud agent，独立 Ubuntu VM + Git worktree，30 秒完成多数任务</li></ul><ul class="notion-list notion-list-disc notion-block-155506bdd97d44cc8d1089eae2b783d4"><li>可落地价值：一次性并行生成多个功能 PR，code review 速度大幅提升</li></ul><ul class="notion-list notion-list-disc notion-block-b4547082527440ef8d4e719ec48fd1fc"><li>对我当前开发/学习的意义：可用于攀岩 app 的并行功能开发，显著提升个人开发效率</li></ul><ul class="notion-list notion-list-disc notion-block-c421cc33058a4e5c9f63bfabe10b83df"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://aitoolanalysis.com/cursor-ai-review/">Cursor AI Review</a></li></ul><div class="notion-text notion-block-7c02d840471c48f4bf8d2d90504ed4d1"><b>Windsurf Wave 13：Arena Mode + Plan Mode</b></div><ul class="notion-list notion-list-disc notion-block-872b2ade8b1b4e8f832722bce06ad4a5"><li>内容：Arena Mode 让两个模型并排对比（隐藏身份，用户投票），Plan Mode 增加结构化任务规划</li></ul><ul class="notion-list notion-list-disc notion-block-9e9c4d5b730f434295ec22ac300b8ec1"><li>可落地价值：对比不同模型在特定任务的真实输出质量，是最实用的模型评估方法</li></ul><ul class="notion-list notion-list-disc 
notion-block-9bd03b74938f4d0bb4879cc3578f4b35"><li>对我当前开发/学习的意义：可用 Arena Mode 评估 GPT-5.4 vs Claude 4.6 在攀岩动作描述任务上的差距</li></ul><ul class="notion-list notion-list-disc notion-block-928112ad565248cfadd56798f658e374"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.logrocket.com/ai-dev-tool-power-rankings/">LogRocket AI Dev Tools 排名</a></li></ul><div class="notion-text notion-block-8ed49ef23024458fbcca1a96e30b42f3"><b>Long-running Autonomous Workflows（2026 架构转变）</b></div><ul class="notion-list notion-list-disc notion-block-3687cd633b6b4ca28b6ed6e4763ba350"><li>内容：agent 从「单次响应」转为「执行循环」，支持持续运行的自主工作流</li></ul><ul class="notion-list notion-list-disc notion-block-8a81ef049799479daebc0d4672b6b57e"><li>可落地价值：攀岩 app 的「上传视频→分析→生成建议→追踪进度」完全可以设计为 long-running agent</li></ul><ul class="notion-list notion-list-disc notion-block-9fff1202c3ac4c35bc10caf54d42c7ae"><li>对我当前开发/学习的意义：理解 execution loop 架构是写进项目 roadmap 的核心概念</li></ul><ul class="notion-list notion-list-disc notion-block-71099fe3fa234c33ba509f608689701a"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://medium.com/@dave-patten/the-state-of-ai-coding-agents-2026-from-pair-programming-to-autonomous-ai-teams-b11f2b39232a">State of AI Coding Agents 2026</a></li></ul><div class="notion-text notion-block-4d8a766ac21d4ca78c343298dd1bee8f"><b>Gemini CLI（开源终端 Agent，99.2k stars）</b></div><ul class="notion-list notion-list-disc notion-block-522da1c0a27d42499b5e3578cadb5b9c"><li>内容：Google 开源，将 Gemini 直接带入终端，支持文件操作和代码执行</li></ul><ul class="notion-list notion-list-disc notion-block-c100bd741a004d44b13d891e83e89046"><li>可落地价值：轻量替代 Claude Code 的方案，适合快速原型，无需额外订阅</li></ul><ul class="notion-list notion-list-disc notion-block-68cf198c4da84000b0fd5da063910e24"><li>对我当前开发/学习的意义：多一个工具选项，可混合使用</li></ul><ul class="notion-list notion-list-disc notion-block-cc090715c14141a68790bb6185ac56bc"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" 
href="https://github.com/trending">GitHub Trending</a></li></ul><hr class="notion-hr notion-block-18f8db496ba04931a8195b9f7a89214e"/><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-f8bcb2d8a18f4a49b8d6eca088fce7c7" data-id="f8bcb2d8a18f4a49b8d6eca088fce7c7"><span><div id="f8bcb2d8a18f4a49b8d6eca088fce7c7" class="notion-header-anchor"></div><a class="notion-hash-link" href="#f8bcb2d8a18f4a49b8d6eca088fce7c7" title="C. 视觉 / 视频 / 运动人体分析"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">C. 视觉 / 视频 / 运动人体分析</span></span></h3><div class="notion-text notion-block-85e9c6229fba4f1ca6776dedf4976ba2"><b>Mobile-VideoGPT（arXiv 2503.21782）</b></div><ul class="notion-list notion-list-disc notion-block-2706c80eb2d1446ea26bfafb9a2bd34b"><li>内容：MBZUAI 出品，0.5B 参数视频理解 LM，1GB 模型大小，需 3GB VRAM，46 tok/sec（RTX A6000），比 LLaVA-OneVision-0.5B 快 2x+，benchmark 高 6 points</li></ul><ul class="notion-list notion-list-disc notion-block-394251264eaa434eaf8b6ed324a3d777"><li>与攀岩动作分析 app 的相关性：极高——这是目前最小最快的视频理解模型，可在 mobile/edge 运行，直接处理攀岩视频，实时推断</li></ul><ul class="notion-list notion-list-disc notion-block-6e7c532c914e415782f22feb87f23d38"><li>可迁移到项目的点：用 Mobile-VideoGPT 做「实时动作描述」模块；frame scoring 策略可优化关键帧提取，去除冗余帧</li></ul><ul class="notion-list notion-list-disc notion-block-8a0379db41ad4a1fb5f1e4a64e2d0175"><li>优先级：高</li></ul><ul class="notion-list notion-list-disc notion-block-a6bf9ebb541143b8a84163c8f9558e96"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2503.21782v1">arXiv</a> | <a target="_blank" rel="noopener 
noreferrer" class="notion-link" href="https://github.com/amshaker/mobile-videogpt">GitHub</a></li></ul><div class="notion-text notion-block-26fa26cbcf234673a88c803d4c857334"><b>LTX-2.3 视频生成（4K@50FPS，同步音频）</b></div><ul class="notion-list notion-list-disc notion-block-8605c3911e144fb2b3d5103503256f94"><li>内容：22B 参数，Apache 2.0，首个开源单 pass 视频+音频生成，支持本地部署</li></ul><ul class="notion-list notion-list-disc notion-block-15d5c39393154485b9887df20234dd73"><li>与攀岩动作分析 app 的相关性：中高——可生成「理想动作」示范视频用于 app 内对比展示和合成训练数据</li></ul><ul class="notion-list notion-list-disc notion-block-fb6215de159b40669afc1feb3ad15d9e"><li>可迁移到项目的点：用 LTX-2.3 生成标准攀岩动作示范，解决训练数据稀缺问题</li></ul><ul class="notion-list notion-list-disc notion-block-cb81f76874764f8b91254598156bc913"><li>优先级：中</li></ul><ul class="notion-list notion-list-disc notion-block-b596fbd3f87d400cb927387480f0a2f3"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/Lightricks/LTX-Video">Lightricks LTX-Video HuggingFace</a></li></ul><div class="notion-text notion-block-a92a181bba974e5195e391f9a6663fcd"><b>Belay AI / AscentAI：攀岩专项 AI 工具现状</b></div><ul class="notion-list notion-list-disc notion-block-2cb857a6b9414a56a2e6090404b86c37"><li>内容：Belay AI 使用 computer vision 追踪攀岩动作、分析技术并预防受伤；AscentAI 提供质心追踪、速度、流畅度、静止比等指标</li></ul><ul class="notion-list notion-list-disc notion-block-a80117666b4d490b9c4354abc875f280"><li>与攀岩动作分析 app 的相关性：直接竞品——了解现有产品技术栈和功能差距有助于定义差异化</li></ul><ul class="notion-list notion-list-disc notion-block-6816d0e9f2bc42bca2e470024000aa0d"><li>可迁移到项目的点：质心追踪 + 流畅度评分是可复现的核心指标，作为 MVP 功能目标</li></ul><ul class="notion-list notion-list-disc notion-block-26d96af7634a4dc69c14844e1bab5dbe"><li>优先级：高</li></ul><ul class="notion-list notion-list-disc notion-block-82b36e9f7aeb4d3f87881408b66347b3"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://belay.ai/">Belay AI</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" 
href="https://play.google.com/store/apps/details?id=com.jonasdeuchler.ascendai">AscentAI</a></li></ul><div class="notion-text notion-block-551b00bfd0b44b01a6739f82ec9e4273"><b>&quot;The Way Up&quot; 攀岩 Hold 检测数据集（arXiv 2505.12854）</b></div><ul class="notion-list notion-list-disc notion-block-5e59e8de89ef4b0087641522164533ae"><li>内容：22 个标注攀岩视频，含 hold 位置、使用顺序和时间标签，使用关键点 2D 姿态估计检测 hold 使用情况</li></ul><ul class="notion-list notion-list-disc notion-block-3f651172e65446b8953be38ca73ed2ae"><li>与攀岩动作分析 app 的相关性：极高——专门为攀岩场景构建的数据集，可直接用于训练</li></ul><ul class="notion-list notion-list-disc notion-block-99ee4e2b21ea46d4a1f52578565f4e03"><li>可迁移到项目的点：复现 hold 使用检测模块，结合姿态估计做「动作路径分析」</li></ul><ul class="notion-list notion-list-disc notion-block-2d931d85bca1495ab791cedea70deae6"><li>优先级：高</li></ul><ul class="notion-list notion-list-disc notion-block-a59514609c7d4b1387ef4b148954b3fc"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/html/2505.12854v1">arXiv</a></li></ul><div class="notion-text notion-block-9f4c1c08e3dc4d43a865697819837172"><b>MiniCPM-V 8B：手机端多模态理解</b></div><ul class="notion-list notion-list-disc notion-block-208a5dd191c54a0f886959c1ea06e913"><li>内容：8B 模型超越 GPT-4V、Gemini Pro、Claude 3，可在手机端运行，视频理解能力强</li></ul><ul class="notion-list notion-list-disc notion-block-4739b564686d4b889353777fe23e0467"><li>与攀岩动作分析 app 的相关性：中高——手机端运行能力对攀岩 app 的 iOS/Android 部署极具价值</li></ul><ul class="notion-list notion-list-disc notion-block-502aba7ca1074d0ea39b4e4c94c8277a"><li>可迁移到项目的点：评估 MiniCPM-V 在攀岩视频描述任务上的实际质量，作为 edge 方案候选</li></ul><ul class="notion-list notion-list-disc notion-block-879efb06764a4a52bbda1e06d6618bd5"><li>优先级：中</li></ul><ul class="notion-list notion-list-disc notion-block-a5a4e917ca6844488c419ee78d593cd1"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.bentoml.com/blog/multimodal-ai-a-guide-to-open-source-vision-language-models">BentoML 开源 VLM 指南</a></li></ul><hr class="notion-hr 
notion-block-8481a361984448a0b1ba44c2268d2dd6"/><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-4a602227866040ed84c30dcd30be82c5" data-id="4a602227866040ed84c30dcd30be82c5"><span><div id="4a602227866040ed84c30dcd30be82c5" class="notion-header-anchor"></div><a class="notion-hash-link" href="#4a602227866040ed84c30dcd30be82c5" title="D. 产品化 / 商业化 / 行业动态"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">D. 产品化 / 商业化 / 行业动态</span></span></h3><div class="notion-text notion-block-43f74d64e8224f48971e09ec6846123a"><b>Anthropic 估值 3800 亿，年化营收 140 亿</b></div><ul class="notion-list notion-list-disc notion-block-6cc9440192fe4f0999e97e640c8e3795"><li>动态：Series G 融资 300 亿后估值 3800 亿，成全球第三大未上市公司，Claude Opus 4.6 的编程能力是核心商业护城河</li></ul><ul class="notion-list notion-list-disc notion-block-bb1e35383b4a40dcaae9444c3069d5cf"><li>背后的趋势判断：AI 公司估值正在与「真实 coding agent 能力」挂钩，而非仅凭对话质量</li></ul><ul class="notion-list notion-list-disc notion-block-ebf3d07e37264e9ca2c85612da19c937"><li>对 side project / 求职 / 项目方向的启发：技术栈选 Claude API 有充分商业背书，Anthropic 生态长期稳定</li></ul><ul class="notion-list notion-list-disc notion-block-9af419651f3448bc803019a26d1d45ca"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.mean.ceo/ai-startup-trends-march-2026/">AI startup trends March 2026</a></li></ul><div class="notion-text notion-block-dc4d47df0b02443a8aff2de75ed4d119"><b>Agentic AI 进入生产：MCP 成事实标准</b></div><ul class="notion-list notion-list-disc notion-block-a91409020f8e43a8aef1766feed8b7ec"><li>动态：2026 年 MCP 成为 agent 连接真实系统的事实标准，multi-agent 系统从 
demo 进入日常工作流</li></ul><ul class="notion-list notion-list-disc notion-block-2c7417532afe44fe82c26373dc0bca58"><li>背后的趋势判断：agent 不再是「聊天助手扩展版」，而是具有持久状态、工具调用、并行执行能力的独立系统</li></ul><ul class="notion-list notion-list-disc notion-block-018fe76350f047afbcb6dec4a1ae6a47"><li>对 side project / 求职 / 项目方向的启发：攀岩 app 的 agent pipeline 应该从第一天就设计 MCP 兼容接口</li></ul><ul class="notion-list notion-list-disc notion-block-fd7073740ab848749b929c7970721f95"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://techcrunch.com/2026/01/02/in-2026-ai-will-move-from-hype-to-pragmatism/">TechCrunch: AI moves from hype to pragmatism</a></li></ul><div class="notion-text notion-block-1d877faa43a14e2a9b71fae5e54121ac"><b>生成式视频跨越商业可行性门槛</b></div><ul class="notion-list notion-list-disc notion-block-0aef946960bd470487e028dad1633c67"><li>动态：LTX-2.3 开源 Apache 2.0 + Kling 3.0 ($0.075/sec API) + Seedance 2.0 (Elo 1269) 三大发布标志视频生成可大规模商业部署</li></ul><ul class="notion-list notion-list-disc notion-block-c73bcc1a02c74e7795ae38059df48978"><li>背后的趋势判断：视频内容生产成本将在 12 个月内下降 10x；视频理解 + 生成的组合应用将是下一个 killer app 方向</li></ul><ul class="notion-list notion-list-disc notion-block-a5b23b604d9b4ab995b779d782539d75"><li>对 side project / 求职 / 项目方向的启发：攀岩 app 的「动作示范视频生成」功能现在技术上可行且成本可控</li></ul><ul class="notion-list notion-list-disc notion-block-d12e95d3713442a5b78514f4e6a30fa7"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.buildfastwithai.com/blogs/ai-models-march-2026-releases">BuildFastWithAI March 2026</a></li></ul><div class="notion-text notion-block-869dac2e9c7e4009bb1591a9247c2bce"><b>Small Language Models 成企业 AI 主流</b></div><ul class="notion-list notion-list-disc notion-block-d5db4f00f58b4fe498843c3b27b14871"><li>动态：IBM、IDC 等机构预测 2026 年 fine-tuned SLM 将成为成熟 AI 企业标配，替代 out-of-the-box LLM</li></ul><ul class="notion-list notion-list-disc notion-block-c1993a3d256f4c089f53ad1c8561a105"><li>背后的趋势判断：成本 + 隐私 + 延迟优势推动企业从通用大模型转向领域专用小模型</li></ul><ul 
class="notion-list notion-list-disc notion-block-5e532c8a366c4fa1b32a99f3ef430534"><li>对 side project / 求职 / 项目方向的启发：攀岩 app 的长期技术路线应包括「fine-tuned 攀岩专用小模型」</li></ul><ul class="notion-list notion-list-disc notion-block-0d2be7bcf67a4e3a92cac88ad3bd46a4"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.ibm.com/think/news/ai-tech-trends-predictions-2026">IBM AI Trends 2026</a></li></ul><hr class="notion-hr notion-block-14ea7800e5f94688a2ad03268788da01"/><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-803f819a88a04967ad19d2ea89946a29" data-id="803f819a88a04967ad19d2ea89946a29"><span><div id="803f819a88a04967ad19d2ea89946a29" class="notion-header-anchor"></div><a class="notion-hash-link" href="#803f819a88a04967ad19d2ea89946a29" title="E. 学习价值 / 求职价值"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">E. 
学习价值 / 求职价值</span></span></h3><div class="notion-text notion-block-e77b41472b38454d963664c89cb1cc28"><b>SWE-Bench 变体深度解读（Verified vs Pro）</b></div><ul class="notion-list notion-list-disc notion-block-d0e1195e1e2c4fd0a549a49061e1cd2b"><li>内容：SWE-Bench Verified 是标准版（Claude 80.8% 胜出）；SWE-Bench Pro 是抗数据污染困难版（GPT-5.4 57.7% 胜出）。理解差异是面试谈 benchmark 的必备知识</li></ul><ul class="notion-list notion-list-disc notion-block-29ad73887ff840f598cd191249d12746"><li>适合我怎么用：面试表达——解释为什么「benchmark 第一」不等于「实际最好」</li></ul><ul class="notion-list notion-list-disc notion-block-edcc98761025467a9a66e717ed804293"><li>推荐动作：把两个 benchmark 的设计差异写成一段话，背熟，面试直接用</li></ul><ul class="notion-list notion-list-disc notion-block-9284400aa8ea47609893efed46792d6a"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://evolink.ai">evolink.ai</a><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://evolink.ai/blog/swe-bench-verified-2026-claude-vs-gpt"> SWE-Bench 解读</a></li></ul><div class="notion-text notion-block-18c02f9851ba46c9a654d83bcf732f4c"><b>Mobile-VideoGPT 论文复现</b></div><ul class="notion-list notion-list-disc notion-block-9d9b8cc1958a4f23a846ec783d231bce"><li>内容：0.5B 参数，3GB VRAM，开源代码，frame scoring 策略是核心创新点</li></ul><ul class="notion-list notion-list-disc notion-block-9e21d0b8dcc1419eb3162bbd967a54fb"><li>适合我怎么用：复现——将攀岩视频喂入 Mobile-VideoGPT，评估动作描述质量，写进项目 portfolio</li></ul><ul class="notion-list notion-list-disc notion-block-77824f28aeaa4ee49238933eb4dda228"><li>推荐动作：fork GitHub repo → 测试 3 段攀岩视频 → 记录输出质量 → 写成 blog</li></ul><ul class="notion-list notion-list-disc notion-block-0139584654b845f09f0b807d8b3cfd71"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/amshaker/mobile-videogpt">GitHub</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2503.21782v1">arXiv</a></li></ul><div class="notion-text notion-block-7dc7b8344a904192965635cecfd02d00"><b>Cursor 并行 
Agent 实操</b></div><ul class="notion-list notion-list-disc notion-block-076ffb3893d6485aac3a5a141b373886"><li>内容：8 个并行 cloud agent，独立 VM，30 秒完成多数任务，BugBot 自动修复 PR</li></ul><ul class="notion-list notion-list-disc notion-block-e59f002bf9c04b1f889d99dcbc7adfc9"><li>适合我怎么用：试用——实际跑一个并行 agent 任务，记录效率提升</li></ul><ul class="notion-list notion-list-disc notion-block-4c63aef173ed419c87f397b5527f93b9"><li>推荐动作：用并行 agent 同时开发攀岩 app 的「视频上传」和「姿态估计」两个模块</li></ul><ul class="notion-list notion-list-disc notion-block-b8c2076a301445448e826c09c6a42bda"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://aitoolanalysis.com/cursor-ai-review/">Cursor AI Review 2026</a></li></ul><div class="notion-text notion-block-1b77a54fa2fb4416916806a15e6e8d9d"><b>&quot;The Way Up&quot; 攀岩数据集</b></div><ul class="notion-list notion-list-disc notion-block-8520f4e2d9174d5c84049c77cb01ad11"><li>内容：22 个攀岩标注视频，hold 使用检测，2D 姿态估计 baseline</li></ul><ul class="notion-list notion-list-disc notion-block-3cd9095113db45689a0c811c71dfe0c4"><li>适合我怎么用：复现——直接作为攀岩 app 的核心训练数据起点</li></ul><ul class="notion-list notion-list-disc notion-block-4b2843482740486cb9c20e4fef7d5e05"><li>推荐动作：下载数据集 → 复现 hold 检测 baseline → 在 LinkedIn 发布 project update</li></ul><ul class="notion-list notion-list-disc notion-block-b09fffd68b4a4579a763af40b8f70d32"><li>链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/html/2505.12854v1">arXiv</a></li></ul><hr class="notion-hr notion-block-1f7bea0cc4584460a9cff4617439f96e"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-d33893376e91469b8ccae5ed085ded78" data-id="d33893376e91469b8ccae5ed085ded78"><span><div id="d33893376e91469b8ccae5ed085ded78" class="notion-header-anchor"></div><a class="notion-hash-link" href="#d33893376e91469b8ccae5ed085ded78" title="三、今日高分 GitHub Repo（固定栏目）"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 
2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">三、今日高分 GitHub Repo（固定栏目）</span></span></h2><div class="notion-text notion-block-37a7153e51bf4e38a2e925a1ecae5cf6"><b>1. amshaker/mobile-videogpt</b></div><ul class="notion-list notion-list-disc notion-block-cae991fc0c5c48c0a3edd0c242379e0c"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/amshaker/mobile-videogpt">https://github.com/amshaker/mobile-videogpt</a></li></ul><ul class="notion-list notion-list-disc notion-block-557e123ad004477797eea93366ef698b"><li>方向标签：video / multimodal / deployment</li></ul><ul class="notion-list notion-list-disc notion-block-c9b4e771923e45e1a43ff3446fc939b3"><li>这项目是干什么的：0.5B 参数的轻量视频理解语言模型，3GB VRAM 可运行，46 tok/sec，专为 edge 设备设计</li></ul><ul class="notion-list notion-list-disc notion-block-ba1b22007fcf4813981779a172cf35fc"><li>为什么今天值得关注：攀岩 app 的核心技术需求——轻量视频理解——的最佳开源解决方案</li></ul><ul class="notion-list notion-list-disc notion-block-d8d8ca68fa0f4975bca86ea23fe4f790"><li>与我的相关性：极高，是攀岩 app 视频分析模块的直接候选技术</li></ul><ul class="notion-list notion-list-disc notion-block-bd4cf6905c104c958c51f7148d41813f"><li>上手成本：中</li></ul><ul class="notion-list notion-list-disc notion-block-dc4ee17745b0429099aab6efe0662683"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-6e8301f7bcbf4520ae828189b9017de5"><li>是否建议我复现：是，优先级最高</li></ul><ul class="notion-list notion-list-disc notion-block-8fed861751574fa7baeff148de531ea8"><li>一句话判断：目前最适合 edge 部署的视频理解模型，攀岩 app 的第一个技术实验对象</li></ul><div class="notion-text notion-block-b4f0c31671ef49389021914c48476fe2"><b>2. 
Lightricks/LTX-Video</b></div><ul class="notion-list notion-list-disc notion-block-8f1095b05754474685c2ffeac8ba7cba"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/Lightricks/LTX-Video">https://github.com/Lightricks/LTX-Video</a></li></ul><ul class="notion-list notion-list-disc notion-block-c9fbfb4d4d3142feaf312cec674d0ccf"><li>方向标签：video / multimodal / app</li></ul><ul class="notion-list notion-list-disc notion-block-a3c9fc84e801428b9365980c57e12a4d"><li>这项目是干什么的：22B 参数开源视频生成模型，4K@50FPS，同步音频，Apache 2.0 商用</li></ul><ul class="notion-list notion-list-disc notion-block-1817fedbec6b4930be1634dda5512274"><li>为什么今天值得关注：「开源视频生成」首次达到商业可部署质量，本地运行无 API 费用</li></ul><ul class="notion-list notion-list-disc notion-block-063c1feaaaeb4f31aaa491dc680d0a0b"><li>与我的相关性：高——生成标准动作示范视频 + 合成训练数据</li></ul><ul class="notion-list notion-list-disc notion-block-9811ec4948cb4517859ca72270d83fc8"><li>上手成本：中（需要较好 GPU）</li></ul><ul class="notion-list notion-list-disc notion-block-cebebc610a3a4961a45c3cb0810a9cae"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-0b20ba3b24684a398c7d562cf14cd92f"><li>是否建议我复现：中期目标</li></ul><ul class="notion-list notion-list-disc notion-block-5cdbc10112f142d184ac2fa93ba40da7"><li>一句话判断：开源视频生成的质量天花板，攀岩 app 数据增强的重要工具</li></ul><div class="notion-text notion-block-7a82919a15d94e21973cce86e45253c2"><b>3. 
ai-dynamo/dynamo</b></div><ul class="notion-list notion-list-disc notion-block-2919b0fc91f0403b8e02fed09aad09ab"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/ai-dynamo/dynamo">https://github.com/ai-dynamo/dynamo</a></li></ul><ul class="notion-list notion-list-disc notion-block-b66ef0c3c94a4ed68e07d53e150297e6"><li>方向标签：infra / deployment</li></ul><ul class="notion-list notion-list-disc notion-block-ac1c921a1de94c9a9ce671c593309c7f"><li>这项目是干什么的：NVIDIA 开源推断 OS，Blackwell GPU 上 7x 性能提升，生产级推断优化工具链</li></ul><ul class="notion-list notion-list-disc notion-block-a93d43acca664898923f07aa283ec43a"><li>为什么今天值得关注：NVIDIA GTC 2026 重点发布，「推断时代」的核心基础设施</li></ul><ul class="notion-list notion-list-disc notion-block-d194b51330d04851ba0e919404118a7d"><li>与我的相关性：中——了解推断优化架构对面试和工程理解有价值</li></ul><ul class="notion-list notion-list-disc notion-block-ee0301e759b24c748eddb589d5138fd3"><li>上手成本：高（需要 Blackwell GPU，建议阅读架构文档为主）</li></ul><ul class="notion-list notion-list-disc notion-block-3db7540d823f4cbabafe1a0d6af3ac0c"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-c8732d1152d64c7c9a6c22d078cb0b9d"><li>是否建议我复现：看文档学原理，暂不复现</li></ul><ul class="notion-list notion-list-disc notion-block-08b9e092b9244ff18df4017cf5eceaf4"><li>一句话判断：inference OS 的第一手参考资料，学原理比跑代码更重要</li></ul><div class="notion-text notion-block-2fa10b470b9a48d3a409ce8165e78a4d"><b>4. 
nvidia/Nemotron-3-Super-120B（HuggingFace）</b></div><ul class="notion-list notion-list-disc notion-block-a4161d69813e4877ba09f47520b3bd6d"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/nvidia/NVIDIA-Nemotron-3-Super-120B-A12B-BF16">https://huggingface.co/nvidia/NVIDIA-Nemotron-3-Super-120B-A12B-BF16</a></li></ul><ul class="notion-list notion-list-disc notion-block-a4bad414b4ab462ea037e2c25ee1d4d8"><li>方向标签：agent / infra</li></ul><ul class="notion-list notion-list-disc notion-block-b67097374d674ad88255b3d20758fcbd"><li>这项目是干什么的：NVIDIA 开源的 120B/12B active 混合 MoE 模型，SWE-Bench 开源第一，可私有部署</li></ul><ul class="notion-list notion-list-disc notion-block-cd456920e2684c7daae5c8dc07b18159"><li>为什么今天值得关注：开源权重可商用，OpenRouter 免费试用入口开放</li></ul><ul class="notion-list notion-list-disc notion-block-c4cbef4d86b54236bc095f4faec9194d"><li>与我的相关性：中——私有部署的编程辅助工具，隐私有要求时的 Claude 替代方案</li></ul><ul class="notion-list notion-list-disc notion-block-78a8f69ed36141c2b7f4cbc42d26111e"><li>上手成本：中（通过 OpenRouter API 低门槛）</li></ul><ul class="notion-list notion-list-disc notion-block-c2a241003f3041c894853a5640a3dd1b"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-01c60cc2bd2846478f34d95f54d1f81b"><li>是否建议我复现：通过 API 试用即可</li></ul><ul class="notion-list notion-list-disc notion-block-31cf1c578cdf4ac38003f302a730b8b4"><li>一句话判断：开源 coding 模型新标杆，OpenRouter 免费入口值得立即试用</li></ul><div class="notion-text notion-block-22c82c71ac734e66ad0c6dd2d4f6cf12"><b>5. 
n8n-io/n8n（150k stars）</b></div><ul class="notion-list notion-list-disc notion-block-d30bc4323ae241f7a271f590921b33f6"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/n8n-io/n8n">https://github.com/n8n-io/n8n</a></li></ul><ul class="notion-list notion-list-disc notion-block-b69dfc72ab944b50ab7acd0e7b3dd21c"><li>方向标签：agent / app</li></ul><ul class="notion-list notion-list-disc notion-block-31e07d2d863b4a19b77834e9194adf96"><li>这项目是干什么的：开源工作流自动化平台，可视化 + 代码双模式，原生 AI 能力，150k stars</li></ul><ul class="notion-list notion-list-disc notion-block-45282b8bd6d248768f354834690e9436"><li>为什么今天值得关注：2026 年 agent 自动化的主流低代码工具，对快速搭建 MVP 有价值</li></ul><ul class="notion-list notion-list-disc notion-block-6d65fa6444e148aca52fc89db2282ec3"><li>与我的相关性：中——可用于攀岩 app 后端自动化（视频上传触发分析 pipeline）</li></ul><ul class="notion-list notion-list-disc notion-block-1d45f933ff944b66af5f9139fa7b069b"><li>上手成本：低</li></ul><ul class="notion-list notion-list-disc notion-block-9b6d2525052a47868df7bdadd7ffb467"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-9fa4e0a8dfef46998f99eef75275a4dc"><li>是否建议我复现：可用于攀岩 app 的快速 MVP</li></ul><ul class="notion-list notion-list-disc notion-block-b57747108c4440a48b1fefa748e68a40"><li>一句话判断：agent workflow 的低门槛起点，比从零写 LangGraph 快 10 倍搭原型</li></ul><div class="notion-text notion-block-75c8f49e69e14c4abd993636be572a10"><b>6. 
confident-ai/deepeval</b></div><ul class="notion-list notion-list-disc notion-block-eb40feb322b741b3a28cf33aba79839f"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/confident-ai/deepeval">https://github.com/confident-ai/deepeval</a></li></ul><ul class="notion-list notion-list-disc notion-block-bb16e97f0bd141e295a6941824de57a5"><li>方向标签：eval / agent</li></ul><ul class="notion-list notion-list-disc notion-block-001fb86327814a3f84edfd2811a5a1a9"><li>这项目是干什么的：LLM 应用测试和评估框架，50+ 评估指标，pytest 原生集成，Apache 2.0</li></ul><ul class="notion-list notion-list-disc notion-block-4957776b8e254a7ba564961d0f737798"><li>为什么今天值得关注：LLM eval 赛道获 Braintrust 8000 万融资关注，DeepEval 是最完整的开源替代</li></ul><ul class="notion-list notion-list-disc notion-block-7b0af3bd74ea4d83b62e1b697c471090"><li>与我的相关性：高——攀岩动作分析的 LLM 输出质量必须有评估机制</li></ul><ul class="notion-list notion-list-disc notion-block-f386309fb1364cbc9f52784d2cd8be32"><li>上手成本：低</li></ul><ul class="notion-list notion-list-disc notion-block-c1b78b81757e491cb327923b43c05605"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-966b0e88c8fa46de8651b12f0c2df279"><li>是否建议我复现：是，直接集成进攀岩 app</li></ul><ul class="notion-list notion-list-disc notion-block-042b5ebe1473482a9020f3068038261f"><li>一句话判断：LLM 应用质量保障的必备工具，pip install deepeval 即开始</li></ul><hr class="notion-hr notion-block-990d18c6f47f4cf1a1d190c732a544fa"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-ea2330828c5147618f38fde47b3bcf8c" data-id="ea2330828c5147618f38fde47b3bcf8c"><span><div id="ea2330828c5147618f38fde47b3bcf8c" class="notion-header-anchor"></div><a class="notion-hash-link" href="#ea2330828c5147618f38fde47b3bcf8c" title="四、今日最值得我看的 3 篇 / 3 个链接"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 
1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">四、今日最值得我看的 3 篇 / 3 个链接</span></span></h2><div class="notion-text notion-block-a05915367a644c7abf1e204ff92ed996"><b>第 1 位：Mobile-VideoGPT GitHub + arXiv</b></div><div class="notion-text notion-block-b4d01523fb5c4eadb30388fd305c2411"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/amshaker/mobile-videogpt">https://github.com/amshaker/mobile-videogpt</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2503.21782v1">https://arxiv.org/abs/2503.21782v1</a></div><div class="notion-text notion-block-b38e881c29404f76a77d6e0ebd14721a">直接命中攀岩 app 的核心技术需求。0.5B 模型、3GB VRAM、46 tok/sec——目前最接近「可以真正部署」的轻量视频理解方案。今天读完 README 和 arXiv 摘要，明天开始复现。</div><div class="notion-text notion-block-88f62fac85d14dfc88e9eb1f985a4a63"><b>第 2 位：The Way Up 攀岩数据集（arXiv 2505.12854）</b></div><div class="notion-text notion-block-e6bff9e5b69442fe8dbd57a94b97c2e3"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/html/2505.12854v1">https://arxiv.org/html/2505.12854v1</a></div><div class="notion-text notion-block-5d1e84841e8840e9a100421bbc7b0ca2">专为攀岩构建的标注数据集。22 个视频 + hold 使用标注 + 时间标签。对攀岩 app 而言，这是「不需要自己收集数据就可以开始训练」的最短路径。读完方法论，评估是否可直接作为项目起点。</div><div class="notion-text notion-block-30cf54572e73486987f5d4880da3d59f"><b>第 3 位：</b><b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://evolink.ai">evolink.ai</a></b><b> SWE-Bench Verified 深度解读</b></div><div class="notion-text notion-block-b2346f628f1d486ca52df3e735d2da37"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://evolink.ai/blog/swe-bench-verified-2026-claude-vs-gpt">https://evolink.ai/blog/swe-bench-verified-2026-claude-vs-gpt</a></div><div 
class="notion-text notion-block-59efa04561194a128998dd0a74efabc5">理解 benchmark 差异是 2026 年 AI 工程师的基本素养。把 SWE-Bench Verified 和 SWE-Bench Pro 的差异讲得非常清楚，是面试谈模型选型的必读材料，20 分钟读完可直接转化为面试表达。</div><hr class="notion-hr notion-block-975bea077f3f44f788588b744d5bce44"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-cd71232db04448148276be902f4e620c" data-id="cd71232db04448148276be902f4e620c"><span><div id="cd71232db04448148276be902f4e620c" class="notion-header-anchor"></div><a class="notion-hash-link" href="#cd71232db04448148276be902f4e620c" title="五、今日行动清单"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">五、今日行动清单</span></span></h2><div class="notion-text notion-block-b0dd90ae294241bf9c449f36fb2bfa88"><b>1. 
今天值得收藏但不必立刻看的</b></div><ul class="notion-list notion-list-disc notion-block-7a83958f9bf64c5ebb8fd0850749aa6d"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blogs.nvidia.com/blog/nemotron-3-super-agentic-ai/">NVIDIA Dynamo 架构文档</a> — inference 优化方向深入时再读</li></ul><ul class="notion-list notion-list-disc notion-block-9cd2e4e7e7e74b1cbc4dc10748b04500"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/Lightricks/LTX-Video">LTX-2.3 详细技术文档</a> — 需要合成训练数据时再看</li></ul><ul class="notion-list notion-list-disc notion-block-4eb19cb488724ab390fe8b5ce859678a"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://belay.ai/">Belay AI 产品体验</a> — 了解竞品，规划差异化</li></ul><ul class="notion-list notion-list-disc notion-block-1bf9096fb3b444459a1a66a357df0e47"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.ibm.com/think/news/ai-tech-trends-predictions-2026">IBM AI Trends 2026 报告</a> — SLM 方向深度资料</li></ul><div class="notion-text notion-block-b0fc244d91984f739ded9881d3af3846"><b>2. 
今天值得精读的</b></div><ul class="notion-list notion-list-disc notion-block-422224a080824738ba7cc6cb27c74ddd"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2503.21782v1">Mobile-VideoGPT arXiv</a> — 今天读 abstract + method，评估复现可行性</li></ul><ul class="notion-list notion-list-disc notion-block-2e675f44f6ce46eda8748b47d0f6efce"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://evolink.ai">evolink.ai</a><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://evolink.ai/blog/swe-bench-verified-2026-claude-vs-gpt"> SWE-Bench 解读</a> — 20 分钟，直接转化为面试内容</li></ul><ul class="notion-list notion-list-disc notion-block-d7c156d3c7804c159514a7eb299a1c26"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/html/2505.12854v1">The Way Up 攀岩数据集</a> — 评估作为项目数据起点的可行性</li></ul><div class="notion-text notion-block-a088e4c62db04f999d8ebb9b00d33729"><b>3. 今天值得复现/试用的</b></div><ul class="notion-list notion-list-disc notion-block-f94886f13f2041efb614e5d8f3ea606e"><li>Mobile-VideoGPT：git clone → 用 3 段攀岩视频测试 → 记录描述质量</li></ul><ul class="notion-list notion-list-disc notion-block-bc467c321b5a495fa0f57d82bc0d1e5d"><li>Nemotron 3 Super via OpenRouter：免费 API 试用编程能力，和 Claude 4.6 做横向对比</li></ul><ul class="notion-list notion-list-disc notion-block-7e572b7c8f934924830c419e10cfae3d"><li>DeepEval：pip install deepeval → 设计一个攀岩动作描述质量的 eval 测试用例</li></ul><div class="notion-text notion-block-985e729468274482b64857c939a463c1"><b>4. 
今天值得记到项目 roadmap 的</b></div><ul class="notion-list notion-list-disc notion-block-ce8e8091a4804c05939adb02804a4df6"><li>视频分析核心模型：将 Mobile-VideoGPT 纳入技术选型候选，对比 Gemini Flash-Lite API 成本</li></ul><ul class="notion-list notion-list-disc notion-block-440525223c984f70b6e1beb14b86d341"><li>数据策略：引入 &quot;The Way Up&quot; 数据集作为 hold 检测模块的起点训练数据</li></ul><ul class="notion-list notion-list-disc notion-block-c94bb58dbdab4f6abb745c52379d3bfe"><li>数据增强：用 LTX-2.3 生成合成标准动作视频（中期目标）</li></ul><ul class="notion-list notion-list-disc notion-block-14266a9299054e278745fa23cba33765"><li>Eval 机制：用 DeepEval 建立动作分析输出的质量评估 pipeline</li></ul><ul class="notion-list notion-list-disc notion-block-837415360ac54dcabcd7ae86af94a36d"><li>竞品研究：整理 Belay AI / AscentAI 的功能矩阵，明确差异化方向</li></ul><div class="notion-text notion-block-0c856a6e7a874654a5394697021f4ab1"><b>5. 今天面试里可以拿来讲的 1~2 个点</b></div><div class="notion-text notion-block-14b21b671f8b40eca704665866e1a2d7">① Benchmark 鉴别能力：&quot;Claude Opus 4.6 在 SWE-Bench Verified 以 80.8% 领先 GPT-5.4 的 77.2%，但在抗数据污染设计的 SWE-Bench Pro 上 GPT-5.4 反超。这说明单一 benchmark 不足以做模型选型，我在攀岩 app 中用任务专属的 eval（DeepEval + 自定义指标）来评估模型在实际动作描述任务上的表现。&quot;</div><div class="notion-text notion-block-267c4b46c20646cc823403300c2794a1">② 轻量化视频理解：&quot;我正在研究 Mobile-VideoGPT，一个 0.5B 参数、3GB VRAM 的视频理解模型，比同规模模型快 2x+。攀岩分析需要实时处理，这种 edge-deployable 的方案比调用云端 API 在延迟和隐私上都有明显优势。&quot;</div><hr class="notion-hr notion-block-43b92ec6fbfe404a924d41df8072144b"/><div class="notion-text notion-block-b0ea167091c240cd978e7c921c1f8b2b"><em>🤖 AI 日报由 Claude 自动生成 | 数据截至 2026-03-27 | 如有遗漏或错误欢迎反馈</em></div></main></div>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[AI 日报 | 2026-03-26]]></title>
            <link>https://dundun0504.com/article/ai-daily-2026-03-26</link>
            <guid>https://dundun0504.com/article/ai-daily-2026-03-26</guid>
            <pubDate>Thu, 26 Mar 2026 00:00:00 GMT</pubDate>
            <description><![CDATA[2026-03-26 AI 日报：Mistral 开源 TTS 模型 Voxtral、Google DeepMind 牵手 Agile Robots、NVIDIA 推断时代到来、agent 框架格局成型、视频生成进入 4K 实时时代。]]></description>
            <content:encoded><![CDATA[<div id="notion-article" class="mx-auto overflow-hidden "><main class="notion light-mode notion-page notion-block-32f670e5549981049769fd4915933dc8"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><div class="notion-text notion-block-f139d6485803426e842538ae6081acc4"><b>① Mistral 发布开源 TTS 模型 Voxtral（2026-03-26）</b></div><div class="notion-text notion-block-d00f86a422a8440183eac787c8deb63b">Mistral 发布支持 9 种语言的开源 TTS 模型，叠加本周 Mistral Small 4（119B MoE，整合 reasoning + multimodal + agentic）和 Mistral 3 系列（14B/8B/3B）。开源阵营的多模态能力栈正在快速补齐，speech/vision/reasoning 开始收敛进同一家开源模型家族。</div><div class="notion-text notion-block-4eb46bd414e84b2fb96f70d25d4834e0">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://techcrunch.com/2026/03/26/mistral-releases-a-new-open-source-model-for-speech-generation/">TechCrunch</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://mistral.ai/news/mistral-3">Mistral 官方</a></div><div class="notion-text notion-block-8518c2550eb54a54a683b80080b7b6f2"><b>② Google DeepMind × Agile Robots：Gemini 进入工厂机器人（2026-03-24）</b></div><div class="notion-text notion-block-2634cc56cd424edda99d5b335d603a96">Google DeepMind 宣布与慕尼黑 Agile Robots 合作，将 Gemini 基础模型集成到工业机器人，构建真实操作数据反馈闭环。这是 VLA（视觉-语言-动作）模型最清晰的工业落地信号，人形/工业机器人方向的项目选题值得关注。</div><div class="notion-text notion-block-d301032a89ac44cda0e39df313290b9b">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://techcrunch.com/2026/03/24/google-partners-with-agile-robots-growing-its-ai-robotics-footprint/">TechCrunch</a></div><div class="notion-text notion-block-0fb9e98d8a3248f5a7c3f2b7e676efde"><b>③ NVIDIA CEO：推断时代的拐点已到来（2026-03-26）</b></div><div class="notion-text notion-block-f37a2361cce449a6a0d67472424835ec">GTC 2026 宣布 Dynamo 1.0（开源推断 OS，Blackwell 提速 7x）和 Vera Rubin 平台。算力重心明确从训练转向推断。Inference optimization 将成为 1-2 年内最热的工程方向，面试谈 KV cache/speculative decoding 
含金量正在上升。</div><div class="notion-text notion-block-c0e84b17db5241fb9e6348ce6aef565e">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blogs.nvidia.com/blog/gtc-2026-news/">NVIDIA GTC</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.fool.com/investing/2026/03/26/nvidia-says-the-inflection-point-of-inference-has/">Motley Fool</a></div><div class="notion-text notion-block-9acc5f4c6733469e9ff8b509c2bd5ec9"><b>④ Replit 4 亿美元 D 轮，估值 90 亿；OpenClaw GitHub 历史最快增长</b></div><div class="notion-text notion-block-b3efb121bc3c4128a9a95ab646a4fa24">Replit 估值半年从 30 亿涨至 90 亿，定位 agentic AI 软件开发。OpenClaw 从 9,000 stars 飙至 21 万，是 GitHub 历史上增长最快的开源项目。市场大力押注 coding agent + agentic dev environment 方向。</div><div class="notion-text notion-block-242c303c9a1841d9aaa157197d7a5e7f">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.mean.ceo/ai-startup-funding-news-march-2026/">Crunchbase</a></div><div class="notion-text notion-block-b7e244df3b9948f8ab001e36c5f7e23d"><b>⑤ 视频生成进入 4K 实时时代（Kling 3.0 / Seedance 2.0 / Helios 同期发布）</b></div><div class="notion-text notion-block-4564e7f8884240e7a8334db1dbc0da0e">Kling 3.0（4K 60fps，API $0.075/sec）、ByteDance Helios（单卡实时 60 秒）、Seedance 2.0（Elo 1,269 全球第一）同期发布。视频生成已跨越「可演示」进入「可部署」阶段。对攀岩 app 价值：生成合成训练视频、可视化动作改进建议。</div><div class="notion-text notion-block-9e4a6662588e4ad5b3292934102781c5">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.buildfastwithai.com/blogs/ai-models-march-2026-releases">BuildFastWithAI</a></div><hr class="notion-hr notion-block-18617fcea9eb44b1a6ae76f66610251d"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-e152f573347146689bb073265e07d4bf" data-id="e152f573347146689bb073265e07d4bf"><span><div id="e152f573347146689bb073265e07d4bf" class="notion-header-anchor"></div><a class="notion-hash-link" href="#e152f573347146689bb073265e07d4bf" title="二、按我的目标分类"><svg viewBox="0 0 16 16" 
width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">二、按我的目标分类</span></span></h2><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-c4d38f356b4b4d8b888d4612057bb680" data-id="c4d38f356b4b4d8b888d4612057bb680"><span><div id="c4d38f356b4b4d8b888d4612057bb680" class="notion-header-anchor"></div><a class="notion-hash-link" href="#c4d38f356b4b4d8b888d4612057bb680" title="A. 前沿模型 / 一手发布"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">A. 
前沿模型 / 一手发布</span></span></h3><div class="notion-text notion-block-aca78ffdcd684838af398a9268172d93"><b>Mistral 3 系列 + Small 4</b></div><ul class="notion-list notion-list-disc notion-block-c99c6616b95b418aa754af124f59822f"><li><b>事件</b>：发布 Mistral 3（14B/8B/3B 稠密模型）+ Mistral Large 3（41B active / 675B total MoE）+ Mistral Small 4（119B，整合 reasoning + multimodal + agentic，128 experts）</li></ul><ul class="notion-list notion-list-disc notion-block-cffd9561ad384a40ae263cae53ce13d8"><li><b>核心内容</b>：Small 4 是首个统一 Magistral/Pixtral/Devstral 能力的单一模型</li></ul><ul class="notion-list notion-list-disc notion-block-799e129aa40a479aaa30c6be8e44c606"><li><b>为什么重要</b>：开源阵营正在追平闭源能力边界；Small 4 的统一架构是未来 edge 部署的重要参考</li></ul><ul class="notion-list notion-list-disc notion-block-655c11410bbe4f40a0ced24cb4764f97"><li><b>我需不需要点开</b>：需要，尤其是 Small 4 的 agentic 能力和 benchmark 表现</li></ul><ul class="notion-list notion-list-disc notion-block-96da0d49663c480f84b9f91e7956632d"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://mistral.ai/news/mistral-3">Mistral 官方</a></li></ul><div class="notion-text notion-block-8b62f41af9f14410a8ac731248e64efb"><b>Qwen 3.5 系列</b></div><ul class="notion-list notion-list-disc notion-block-413cc592760a4b28a5d0b0686e0f6109"><li><b>事件</b>：Qwen 3.5 9B 在 GPQA Diamond 得 81.7，Video-MME 得 84.5（对比 Gemini 2.5 Flash-Lite 的 74.6）</li></ul><ul class="notion-list notion-list-disc notion-block-b4f18c67baa74727a7d68c0ccd03dddd"><li><b>核心内容</b>：9B 模型超越 13x 规模对手，原生支持文本/图像/视频，无需独立视觉适配器</li></ul><ul class="notion-list notion-list-disc notion-block-0f3a515cbba94f39b86c05cab5c91a57"><li><b>为什么重要</b>：轻量模型达到顶级视频理解能力，对攀岩 app 的 mobile/edge 部署方向极其相关</li></ul><ul class="notion-list notion-list-disc notion-block-01a79496e95745ce94f8e8c5ec54ae60"><li><b>我需不需要点开</b>：需要，重点看视频理解能力部分</li></ul><ul class="notion-list notion-list-disc notion-block-6f0e423d52464cc2adf1b7b5f0e5f1e2"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" 
href="https://www.buildfastwithai.com/blogs/ai-models-march-2026-releases">BuildFastWithAI</a></li></ul><div class="notion-text notion-block-f063d6c4d8104639a51413bdb6c8cd23"><b>Gemini 3.1 系列</b></div><ul class="notion-list notion-list-disc notion-block-da96318fff04414590992367440ff7fb"><li><b>事件</b>：Gemini 3.1 Pro（1M context，ARC-AGI-2 77.1%）+ Flash-Lite（2.5x 更快，$0.25/M tokens）</li></ul><ul class="notion-list notion-list-disc notion-block-767df87daee247e98a553d4446da60da"><li><b>核心内容</b>：多模态全覆盖，Flash-Lite 成本极低</li></ul><ul class="notion-list notion-list-disc notion-block-a283731a814f46708ccafed6cd51b226"><li><b>为什么重要</b>：Flash-Lite 是 API 调用视频理解的高性价比选择，值得纳入攀岩 app 的模型选型</li></ul><ul class="notion-list notion-list-disc notion-block-2338b6b1b36b414299908740867193a1"><li><b>我需不需要点开</b>：中等，关注 Flash-Lite 的视频 token 价格和长度限制</li></ul><ul class="notion-list notion-list-disc notion-block-aed8af68468b4f389bc970e39cf5dc70"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://llm-stats.com/ai-news">LLM Stats</a></li></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-1a15b5e8b3f44f0a9e697949d611d095" data-id="1a15b5e8b3f44f0a9e697949d611d095"><span><div id="1a15b5e8b3f44f0a9e697949d611d095" class="notion-header-anchor"></div><a class="notion-hash-link" href="#1a15b5e8b3f44f0a9e697949d611d095" title="B. AI 工程 / Agent / Coding Workflow"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">B. 
AI 工程 / Agent / Coding Workflow</span></span></h3><div class="notion-text notion-block-d1605d396c9c4c979dc30fc3a434427f"><b>Claude Code vs Cursor 深度对比</b></div><ul class="notion-list notion-list-disc notion-block-eaf83425e8c94e0ba0f02558069ca422"><li><b>内容</b>：Claude Code 以 1M token context + computer use 为核心优势；Cursor 以 tab 补全和 IDE 集成取胜。Claude Code 完成相同任务消耗 token 比 Cursor 少 5.5x</li></ul><ul class="notion-list notion-list-disc notion-block-6e2a7d4e624f4e0bacc1656963c67d6f"><li><b>可落地价值</b>：混合使用策略——Cursor 做日常迭代，Claude Code 做大型功能和全库重构</li></ul><ul class="notion-list notion-list-disc notion-block-474c78b3ea9e48769efbab7f8d5945ce"><li><b>对我的意义</b>：当前最值得掌握的 coding workflow 模式，面试中可演示实际效率提升</li></ul><ul class="notion-list notion-list-disc notion-block-5d15c97aa24e496a951a88df0fc6ede7"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://Emergent.sh">Emergent.sh</a></li></ul><div class="notion-text notion-block-8d4c7a1439f94b70a33659324c00d0f3"><b>LangGraph 34.5M 月下载量，agent 框架格局已定</b></div><ul class="notion-list notion-list-disc notion-block-2ebd2de432c948c880d6c56197931c4e"><li><b>内容</b>：LangGraph 成为最受生产验证的 agent 框架（Klarna/Uber/LinkedIn 使用），CrewAI 次之</li></ul><ul class="notion-list notion-list-disc notion-block-e22bb6ab2ba843b8a9450db29b4854a0"><li><b>可落地价值</b>：LangGraph 值得作为 agent 项目的首选框架</li></ul><ul class="notion-list notion-list-disc notion-block-5622c126ba58407b8088ddaa1f5f7c6b"><li><b>对我的意义</b>：攀岩 app 的「视频上传→分析→建议生成」流程完全可以用 LangGraph 搭建 agent pipeline</li></ul><ul class="notion-list notion-list-disc notion-block-627d60fb227945e9964534c10cf74db0"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/langchain-ai/langgraph">LangGraph GitHub</a></li></ul><div class="notion-text notion-block-9fbfcdc3731943248ab449e09e5aef0a"><b>Braintrust 融资 8000 万美元，LLM eval 赛道升温</b></div><ul class="notion-list notion-list-disc notion-block-097ef2d63ab64909b604ccfc9d05e680"><li><b>内容</b>：Braintrust 
估值 8 亿。DeepEval（Apache-2.0，50+ 指标）仍是最完整的开源 eval 工具</li></ul><ul class="notion-list notion-list-disc notion-block-f90b9eab91644afaa114f8f982c54d06"><li><b>可落地价值</b>：构建 AI 应用必须配套 eval；DeepEval 可直接集成进 pytest</li></ul><ul class="notion-list notion-list-disc notion-block-8e77112fbe4a4905b110aca1a85a9f2a"><li><b>对我的意义</b>：攀岩 app 的动作分析质量评估需要 eval 框架，DeepEval 是起手首选</li></ul><ul class="notion-list notion-list-disc notion-block-1d1f67abfb2c496a87b6552b0755c5a7"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/confident-ai/deepeval">DeepEval GitHub</a></li></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-636a0ffcc63d405881549afa575ee45f" data-id="636a0ffcc63d405881549afa575ee45f"><span><div id="636a0ffcc63d405881549afa575ee45f" class="notion-header-anchor"></div><a class="notion-hash-link" href="#636a0ffcc63d405881549afa575ee45f" title="C. 视觉 / 视频 / 运动人体分析"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">C. 
视觉 / 视频 / 运动人体分析</span></span></h3><div class="notion-text notion-block-6684ec5ba9df4c65b2a422013ab0affd"><b>Qwen 3.5 9B 的视频理解能力</b></div><ul class="notion-list notion-list-disc notion-block-216c1c4650bc4231801cf73a8e125d92"><li><b>内容</b>：Video-MME 得分 84.5（含字幕），无需独立视觉适配器，原生 multimodal</li></ul><ul class="notion-list notion-list-disc notion-block-e088daf1bd294adaa592c8a3ff8095af"><li><b>与攀岩动作分析 app 的相关性</b>：高——9B 模型有可能在 edge 设备上运行，直接处理攀岩视频</li></ul><ul class="notion-list notion-list-disc notion-block-4cb3e77cdc0142269f08c1d5f10a4df7"><li><b>可迁移到项目的点</b>：用 Qwen 3.5 替代更大模型做视频描述 + 动作识别，降低推断成本</li></ul><ul class="notion-list notion-list-disc notion-block-39b7f24bddfb4517acc0043fb0999a85"><li><b>优先级</b>：高</li></ul><ul class="notion-list notion-list-disc notion-block-b3ae5c08f88f474b8c6880e7aa0a82cc"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.buildfastwithai.com/blogs/ai-models-march-2026-releases">BuildFastWithAI</a></li></ul><div class="notion-text notion-block-0842596a147b45d2b8c0583b85b542d3"><b>3D 姿态估计工业应用新论文（M-PCT + DGST）</b></div><ul class="notion-list notion-list-disc notion-block-17d977969088489c9e28d492aebd6f91"><li><b>内容</b>：Multi-scale Pose as Compositional Tokens + Distance-Gated Spatiotemporal Transformer，针对复杂工业场景的 3D 姿态重建</li></ul><ul class="notion-list notion-list-disc notion-block-de6cefd4a10f45c18774f838bc96d03d"><li><b>与攀岩动作分析 app 的相关性</b>：中高——攀岩场景遮挡多、视角复杂，DGST 的时空建模方法可迁移</li></ul><ul class="notion-list notion-list-disc notion-block-58e2e89af62e4680a743be064902c3a4"><li><b>可迁移到项目的点</b>：M-PCT 的多尺度 token 方法可改善攀岩身体部位表示精度</li></ul><ul class="notion-list notion-list-disc notion-block-236aad8f1c894372a2e22108dd3e96dd"><li><b>优先级</b>：中</li></ul><ul class="notion-list notion-list-disc notion-block-16d84e32f0984a56aa9b50116b4e9533"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" 
href="https://www.sciencedirect.com/science/article/abs/pii/S014193822500335X">ScienceDirect</a></li></ul><div class="notion-text notion-block-7bce359d6c35413899ed6fd43da0cd5c"><b>视频生成 4K 实时化（Kling 3.0 / Helios / Seedance 2.0）</b></div><ul class="notion-list notion-list-disc notion-block-291f69f277d34a17bfafd4a7e0f93367"><li><b>内容</b>：Kling 3.0（API $0.075/sec）、ByteDance Helios（单卡实时 60 秒）、Seedance 2.0（Elo 1,269 全球第一）</li></ul><ul class="notion-list notion-list-disc notion-block-0febbaa9a14643f2a1c3ced3f1e2d9cb"><li><b>与攀岩动作分析 app 的相关性</b>：中——可用于生成标准动作示范视频、合成训练数据</li></ul><ul class="notion-list notion-list-disc notion-block-fb12ad6c5fa346c4969f5d4653d3fc57"><li><b>可迁移到项目的点</b>：用 Kling API 生成「理想动作」对比视频，增强用户体验</li></ul><ul class="notion-list notion-list-disc notion-block-94bc8d1cef9c45378a61d357056c096c"><li><b>优先级</b>：中</li></ul><ul class="notion-list notion-list-disc notion-block-fc3319ab0b834aa9889b819b0acf4aff"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.buildfastwithai.com/blogs/ai-models-march-2026-releases">BuildFastWithAI</a></li></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-d3b4c06755a94c55a5d1845190a7afa9" data-id="d3b4c06755a94c55a5d1845190a7afa9"><span><div id="d3b4c06755a94c55a5d1845190a7afa9" class="notion-header-anchor"></div><a class="notion-hash-link" href="#d3b4c06755a94c55a5d1845190a7afa9" title="D. 产品化 / 商业化 / 行业动态"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">D. 
产品化 / 商业化 / 行业动态</span></span></h3><div class="notion-text notion-block-05c3b9c6859741a8a68c7d7cbfdb4b23"><b>从对话 AI 到 Agentic AI 的明确转折（2026-03-23/24）</b></div><ul class="notion-list notion-list-disc notion-block-87225ffaaa104c0790268c7269162044"><li><b>动态</b>：业内普遍认定 2026 年 3 月下旬是「从对话助手转向自主 agent 系统时代」的分水岭</li></ul><ul class="notion-list notion-list-disc notion-block-03b09f9590c7438582aab1c7b0483b20"><li><b>背后的趋势判断</b>：agent 不再是 demo，开始进入真实工作流；LangGraph/CrewAI 等框架成为标配工程工具</li></ul><ul class="notion-list notion-list-disc notion-block-b126c03cddfd4942a5e56295ff255ab5"><li><b>对 side project / 求职 / 项目方向的启发</b>：简历和项目里要有 agent 实际落地案例，而非只是 API 调用</li></ul><ul class="notion-list notion-list-disc notion-block-85d30dbfa65541a3b98b9f9b36534927"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://llm-stats.com/ai-news">LLM Stats</a></li></ul><div class="notion-text notion-block-d6ea76656ebb47419d7259ad4d58c40f"><b>Apple 全面重构 AI 框架（Core AI 替代 Core ML）</b></div><ul class="notion-list notion-list-disc notion-block-07743674120645e5bd4271e1255cd879"><li><b>动态</b>：WWDC 2026 前预告 Core AI Framework，3 行 Swift 代码接入 Apple Intelligence；同期 Siri 整合 Google Gemini</li></ul><ul class="notion-list notion-list-disc notion-block-df650faf6cb54664bc49b2b0a86982e9"><li><b>背后的趋势判断</b>：移动端 AI 基础设施在 2026 年迎来重写；iOS 生态 AI 应用开发门槛大幅降低</li></ul><ul class="notion-list notion-list-disc notion-block-cdabc2ac361f4a56991392abad3e137e"><li><b>对 side project / 求职 / 项目方向的启发</b>：攀岩 app iOS 版本可原生接入 Apple Intelligence，做 on-device 推断，隐私优势显著</li></ul><ul class="notion-list notion-list-disc notion-block-e0babaa6d64c45a4a5be496cd931d6f1"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://9to5mac.com/2026/03/01/apple-replacing-core-ml-with-modernized-core-ai-framework-for-ios-27/">9to5Mac</a></li></ul><div class="notion-text notion-block-38de9f4ee4f840409100a66d55e3096f"><b>人形机器人融资浪潮：单周 12 亿美元</b></div><ul class="notion-list notion-list-disc 
notion-block-25ea407654f4495f90f25d4d6a52a44f"><li><b>动态</b>：Mind Robotics $5 亿 + Rhoda AI $4.5 亿 + Sunday $1.65 亿 + Oxa $1.03 亿；中国控制全球人形机器人市场 90%，Unitree H2 售价低于 3 万美元</li></ul><ul class="notion-list notion-list-disc notion-block-1153b48e3392426b917743291997705c"><li><b>背后的趋势判断</b>：机器人 + VLA 模型是 2026-2027 年最大的硬件赛道</li></ul><ul class="notion-list notion-list-disc notion-block-b2e3b940fb8c44ceb903ab590654c180"><li><b>对 side project / 求职 / 项目方向的启发</b>：动作分析 + 机器人控制的交叉方向极具含金量</li></ul><ul class="notion-list notion-list-disc notion-block-64ab7f26b96349c6a2b0132c92c16f7c"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://restofworld.org/2026/china-humanoid-robots-unitree-agibot-tesla-optimus/">RestOfWorld</a></li></ul><h3 class="notion-h notion-h2 notion-h-indent-1 notion-block-a0825935fe9645a6a37eb899e2f953d8" data-id="a0825935fe9645a6a37eb899e2f953d8"><span><div id="a0825935fe9645a6a37eb899e2f953d8" class="notion-header-anchor"></div><a class="notion-hash-link" href="#a0825935fe9645a6a37eb899e2f953d8" title="E. 学习价值 / 求职价值"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">E. 
学习价值 / 求职价值</span></span></h3><div class="notion-text notion-block-b2472389abd14994b3578539c76df684"><b>Inference Optimization 技术栈（KV Cache / Continuous Batching / Speculative Decoding）</b></div><ul class="notion-list notion-list-disc notion-block-26617fdf68c44d2eb4d9421931b85b1e"><li><b>内容</b>：NVIDIA Dynamo 1.0 开源了完整的推断优化 OS；inference 正成为工程重心</li></ul><ul class="notion-list notion-list-disc notion-block-2b6cd505444745f29f2949338da4f52a"><li><b>适合我怎么用</b>：精读 Dynamo 文档 + 复现 speculative decoding 原理，面试中可讲</li></ul><ul class="notion-list notion-list-disc notion-block-b5f69d6e20564769a80c6840f4e135cb"><li><b>推荐动作</b>：把 speculative decoding 做成一个 demo 或 blog，直接写进项目</li></ul><ul class="notion-list notion-list-disc notion-block-66643da10f7b4aeda30b2a670218fb3e"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blogs.nvidia.com/blog/gtc-2026-news/">NVIDIA Dynamo</a></li></ul><div class="notion-text notion-block-15bca9b82607419c96e9a13d068a5fde"><b>LangGraph Agent 开发</b></div><ul class="notion-list notion-list-disc notion-block-09f73d37efd14c358d9de8f26b7c9ba6"><li><b>内容</b>：34.5M 月下载量，Klarna/Uber/LinkedIn 生产使用，是当前最值得掌握的 agent 框架</li></ul><ul class="notion-list notion-list-disc notion-block-46f3cc09961e482b9866dc36dd7cecf0"><li><b>适合我怎么用</b>：复现一个完整的 multi-agent workflow（比如「视频上传→分析 agent→建议生成 agent」），直接用于攀岩 app</li></ul><ul class="notion-list notion-list-disc notion-block-cbedd5f9fb484a2caf36f3bc06313495"><li><b>推荐动作</b>：完成官方 LangGraph 教程并记录过程，作为 portfolio 项目</li></ul><ul class="notion-list notion-list-disc notion-block-1c88fab9728946d980745db3d2be329c"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/langchain-ai/langgraph">LangGraph GitHub</a></li></ul><div class="notion-text notion-block-5e20d9dd627644548fe7de18db7b2115"><b>DeepEval：LLM eval 框架上手</b></div><ul class="notion-list notion-list-disc notion-block-812a3b3796ec4eaeba3b3fa26e6d64bd"><li><b>内容</b>：50+ 指标，pytest 
集成，Apache-2.0，是最完整的开源 eval 工具</li></ul><ul class="notion-list notion-list-disc notion-block-29361c2a0f5b4ed7b38ed58140c6b140"><li><b>适合我怎么用</b>：复现一个针对视频描述质量的 eval pipeline，直接用于攀岩 app 质量保证</li></ul><ul class="notion-list notion-list-disc notion-block-d0e9e3ce8d784a1b9d90b6b8274aa7f4"><li><b>推荐动作</b>：用 DeepEval 给攀岩动作描述的 LLM 输出打分，作为 eval 设计的 portfolio 项目</li></ul><ul class="notion-list notion-list-disc notion-block-d2070d7c278048b49b8de543d28baa85"><li><b>链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/confident-ai/deepeval">DeepEval GitHub - eval framework</a></li></ul><hr class="notion-hr notion-block-04d980b0a96d40e78541393a494623c8"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-4c65583e8e3f41e7b62e1dc932d2c255" data-id="4c65583e8e3f41e7b62e1dc932d2c255"><span><div id="4c65583e8e3f41e7b62e1dc932d2c255" class="notion-header-anchor"></div><a class="notion-hash-link" href="#4c65583e8e3f41e7b62e1dc932d2c255" title="三、今日高分 GitHub Repo"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">三、今日高分 GitHub Repo</span></span></h2><div class="notion-text notion-block-f21de76e95a0443f95f049e452764ef6"><b>1. 
bytedance/deer-flow</b></div><ul class="notion-list notion-list-disc notion-block-38b7e390ab6d42289a35412dcbce88ac"><li><b>GitHub 链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/bytedance/deer-flow">https://github.com/bytedance/deer-flow</a></li></ul><ul class="notion-list notion-list-disc notion-block-b104d453f52841b9889769630ddec6a2"><li><b>方向标签</b>：agent</li></ul><ul class="notion-list notion-list-disc notion-block-4cd08d0b7b1445289d66174e09b03ce1"><li><b>这项目是干什么的</b>：字节跳动开源的长程超级 agent，可完成需要多步规划和长周期执行的复杂任务</li></ul><ul class="notion-list notion-list-disc notion-block-951bc489dbd34258b264d7ab123d8a7e"><li><b>为什么今天值得关注</b>：昨日新增 2,388 stars，总计 47,780，近期增长最快的 agent 项目之一</li></ul><ul class="notion-list notion-list-disc notion-block-81c5f69e44d443c1a9a862bddff0e271"><li><b>与我的相关性</b>：高——攀岩 app 的「视频上传→分析→建议生成」流程是典型长程 agent 任务</li></ul><ul class="notion-list notion-list-disc notion-block-082411dbf20f444483a5ebda801f41da"><li><b>上手成本</b>：中</li></ul><ul class="notion-list notion-list-disc notion-block-1b6333b5946a4324aee3e7678dd020e8"><li><b>是否建议我收藏</b>：是</li></ul><ul class="notion-list notion-list-disc notion-block-e90f1a9d1b0b497aaf0d5f8b6a187714"><li><b>是否建议我复现</b>：是，先跑通 demo</li></ul><ul class="notion-list notion-list-disc notion-block-19287e8daf0e4ab0a369a977f893be9f"><li><b>一句话判断</b>：字节出品、star 爆发、架构文档完整，值得第一时间复现</li></ul><div class="notion-text notion-block-8a3c25dbd039419cacfb11e4f3f6e20d"><b>2. 
confident-ai/deepeval</b></div><ul class="notion-list notion-list-disc notion-block-4b923d14f1aa446388497f7cb534d217"><li><b>GitHub 链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/confident-ai/deepeval">https://github.com/confident-ai/deepeval</a></li></ul><ul class="notion-list notion-list-disc notion-block-eee068b65f494c609669aac86b65bb91"><li><b>方向标签</b>：eval</li></ul><ul class="notion-list notion-list-disc notion-block-8e58b7ea873b4440b1998b5ce568f756"><li><b>这项目是干什么的</b>：LLM 应用的测试和评估框架，50+ 评估指标，pytest 原生集成</li></ul><ul class="notion-list notion-list-disc notion-block-30b9841dc94646019f80b2b558c3312f"><li><b>为什么今天值得关注</b>：Braintrust 融资 8000 万让整个 eval 赛道被重新审视；DeepEval 是最完整的免费替代</li></ul><ul class="notion-list notion-list-disc notion-block-e27d19e7ea084f0aac6a092e470ff3b9"><li><b>与我的相关性</b>：高——攀岩 app 需要评估动作分析质量</li></ul><ul class="notion-list notion-list-disc notion-block-765574c39d0b40db806e6490346e86c2"><li><b>上手成本</b>：低</li></ul><ul class="notion-list notion-list-disc notion-block-b5c39c8e43944619a35d9e617cb8b9cb"><li><b>是否建议我收藏</b>：是</li></ul><ul class="notion-list notion-list-disc notion-block-61ce956d34db4f24a11aaf83b0c76a67"><li><b>是否建议我复现</b>：是，直接集成进项目</li></ul><ul class="notion-list notion-list-disc notion-block-533806cba55649c4a117c6b0ce6f3803"><li><b>一句话判断</b>：LLM 应用开发必备，门槛低、功能全、文档好</li></ul><div class="notion-text notion-block-924d4e70a3ae413e8aaee584718b5b37"><b>3. 
langchain-ai/langgraph</b></div><ul class="notion-list notion-list-disc notion-block-815c3f30eefb414d8ab0bb8a5c40a378"><li><b>GitHub 链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/langchain-ai/langgraph">https://github.com/langchain-ai/langgraph</a></li></ul><ul class="notion-list notion-list-disc notion-block-cddba08f7a2140499fbf6b92fcd19c67"><li><b>方向标签</b>：agent / infra</li></ul><ul class="notion-list notion-list-disc notion-block-09e03179b4eb4d61a3ae0aca89deafad"><li><b>这项目是干什么的</b>：生产级 agent 编排框架，支持有状态多 agent 工作流</li></ul><ul class="notion-list notion-list-disc notion-block-db7f9e687792432291865dd2e798595c"><li><b>为什么今天值得关注</b>：34.5M 月下载量，Klarna/Uber/LinkedIn 生产使用，是 agent 框架的事实标准</li></ul><ul class="notion-list notion-list-disc notion-block-52116b7958a54570b648dafdac2172df"><li><b>与我的相关性</b>：高——是攀岩 app agent pipeline 的首选框架</li></ul><ul class="notion-list notion-list-disc notion-block-64c9311df5ac4a4c9ac30c2519a99070"><li><b>上手成本</b>：中</li></ul><ul class="notion-list notion-list-disc notion-block-deffcb62691744abbf80acd960856d0b"><li><b>是否建议我收藏</b>：是</li></ul><ul class="notion-list notion-list-disc notion-block-0fc49465104c4d67b1ec451cc06e4adc"><li><b>是否建议我复现</b>：是，官方教程文档完整</li></ul><ul class="notion-list notion-list-disc notion-block-3dfd0ee2de4f4e5bb6e83dc68c37c5b0"><li><b>一句话判断</b>：当前 agent 开发的最优选，不需要犹豫</li></ul><div class="notion-text notion-block-969b658d7ff74a2eb6d3182372c779aa"><b>4. 
caramaschiHG/awesome-ai-agents-2026</b></div><ul class="notion-list notion-list-disc notion-block-c2be967490e14700b05670485ffba96e"><li><b>GitHub 链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/caramaschiHG/awesome-ai-agents-2026">https://github.com/caramaschiHG/awesome-ai-agents-2026</a></li></ul><ul class="notion-list notion-list-disc notion-block-fb4691a79c69413f8fd94bb330ba9c25"><li><b>方向标签</b>：agent / app</li></ul><ul class="notion-list notion-list-disc notion-block-afa52b684956437cbbe6911cd75c30f4"><li><b>这项目是干什么的</b>：2026 年 AI Agent 生态全景图，涵盖框架/工具/产品分类整理</li></ul><ul class="notion-list notion-list-disc notion-block-bb87c8a00b484b4c8211b1cabc3342e6"><li><b>为什么今天值得关注</b>：包含 OpenClaw 历史最快增长记录的分析；最新框架格局总结</li></ul><ul class="notion-list notion-list-disc notion-block-1abbb218ec334f00ad3ea2a0a40c11e1"><li><b>与我的相关性</b>：中——选型参考和技术雷达用途</li></ul><ul class="notion-list notion-list-disc notion-block-c1ec510305474d7cbd1642ebd9e87993"><li><b>上手成本</b>：低（阅读为主）</li></ul><ul class="notion-list notion-list-disc notion-block-34f83590671645aa8f7213435f9bc1da"><li><b>是否建议我收藏</b>：是</li></ul><ul class="notion-list notion-list-disc notion-block-e38fba5273a748c5ab4bfef05fe09bbf"><li><b>是否建议我复现</b>：否</li></ul><ul class="notion-list notion-list-disc notion-block-0b2277a0f82a4b598924a0b324a3b0cb"><li><b>一句话判断</b>：agent 框架选型必读清单，节省大量调研时间</li></ul><div class="notion-text notion-block-43249be2728249e0aff363637105f4d4"><b>5. 
infiniflow/ragflow</b></div><ul class="notion-list notion-list-disc notion-block-bed548b661334097bf8aebcaa5cf80cc"><li><b>GitHub 链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/infiniflow/ragflow">https://github.com/infiniflow/ragflow</a></li></ul><ul class="notion-list notion-list-disc notion-block-b60d71f925b14abfbe9e9a35c9b69dd8"><li><b>方向标签</b>：agent / infra</li></ul><ul class="notion-list notion-list-disc notion-block-072b442c7d5140d1ad56aceef64d6ef9"><li><b>这项目是干什么的</b>：深度文档理解 + RAG 引擎，原生支持 multimodal，正在演进为 Context Engine</li></ul><ul class="notion-list notion-list-disc notion-block-821d979f91784550af6166a759797415"><li><b>为什么今天值得关注</b>：multimodal RAG 是 2026 年 RAG 进化的核心方向</li></ul><ul class="notion-list notion-list-disc notion-block-122485de5b344ce98bdc651d9f5c1ea0"><li><b>与我的相关性</b>：中高——可用于攀岩动作知识库（文字教程 + 示范视频片段混合检索）</li></ul><ul class="notion-list notion-list-disc notion-block-455672a20a5f4512bcc593275a2336fc"><li><b>上手成本</b>：中</li></ul><ul class="notion-list notion-list-disc notion-block-fc2b04f55b6245bf9c2f8675db89a87c"><li><b>是否建议我收藏</b>：是</li></ul><ul class="notion-list notion-list-disc notion-block-ffa580ee40d046c49197504099ba9a58"><li><b>是否建议我复现</b>：中期目标</li></ul><ul class="notion-list notion-list-disc notion-block-1311079f57f547f396276fd27a3a1432"><li><b>一句话判断</b>：RAG 进化方向最清晰的开源项目，multimodal 支持领先</li></ul><div class="notion-text notion-block-111606d61f484e46a51b176b313be5c0"><b>6. 
ai-dynamo/dynamo</b></div><ul class="notion-list notion-list-disc notion-block-50a58dfabb8e46b6998edf8041c3b419"><li><b>GitHub 链接</b>：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/ai-dynamo/dynamo">https://github.com/ai-dynamo/dynamo</a></li></ul><ul class="notion-list notion-list-disc notion-block-7b645bd402ec4e07bab1a08345635865"><li><b>方向标签</b>：infra / deployment</li></ul><ul class="notion-list notion-list-disc notion-block-7453839e79f14025a25d4297af6d9b12"><li><b>这项目是干什么的</b>：NVIDIA 开源推断 OS，Blackwell GPU 上性能提升 7x</li></ul><ul class="notion-list notion-list-disc notion-block-cf162c46057a4a7ca03bd8d14339cf67"><li><b>为什么今天值得关注</b>：NVIDIA CEO 本周宣布「推断时代拐点到来」</li></ul><ul class="notion-list notion-list-disc notion-block-31ae06b653284b0eba3b8f8a4e33623f"><li><b>与我的相关性</b>：中——推断优化是面试高频话题，理解原理有价值</li></ul><ul class="notion-list notion-list-disc notion-block-1c59b11358a741f5bd7cd234a5849acc"><li><b>上手成本</b>：高（需要 Blackwell GPU）</li></ul><ul class="notion-list notion-list-disc notion-block-899ac993d5a94787a9e19de0bb2fb911"><li><b>是否建议我收藏</b>：是（学原理为主）</li></ul><ul class="notion-list notion-list-disc notion-block-2e0d27a3b1644471a00e04db2944fed4"><li><b>是否建议我复现</b>：待观察（硬件门槛高）</li></ul><ul class="notion-list notion-list-disc notion-block-6daacd917ca646a288451594b33cc8e1"><li><b>一句话判断</b>：了解 inference OS 架构设计的第一手资料，学原理比跑代码更重要</li></ul><hr class="notion-hr notion-block-a74df22f1ffe4c0583476161cd6e7b1c"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-9392e79b1932428097db1bc8cf5dc85b" data-id="9392e79b1932428097db1bc8cf5dc85b"><span><div id="9392e79b1932428097db1bc8cf5dc85b" class="notion-header-anchor"></div><a class="notion-hash-link" href="#9392e79b1932428097db1bc8cf5dc85b" title="四、今日最值得我看的 3 篇 / 3 个链接"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 
3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">四、今日最值得我看的 3 篇 / 3 个链接</span></span></h2><div class="notion-text notion-block-412e4172a84f4adf89339f564662e24c"><b>第 1 位：</b><b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://Emergent.sh">Emergent.sh</a></b><b> — Claude Code vs Cursor 深度对比</b></div><div class="notion-text notion-block-6a98d78f9c2f4c69906c6a484650e320">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://emergent.sh/learn/claude-code-vs-cursor">https://emergent.sh/learn/claude-code-vs-cursor</a></div><div class="notion-text notion-block-3d83014f8ecd4623a24e3b4d387d95d0">直接影响你明天开始写代码的效率。文章给出了可量化的 token 对比数据（5.5x 差距），还有具体的混合使用策略。10 分钟读完，立即可落地。</div><div class="notion-text notion-block-a67c4b1c951d415db205bea97281db52"><b>第 2 位：BuildFastWithAI — 2026 年 3 月 AI 模型综述</b></div><div class="notion-text notion-block-c170ace79f5b477db36c4cf1822052a5">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.buildfastwithai.com/blogs/ai-models-march-2026-releases">https://www.buildfastwithai.com/blogs/ai-models-march-2026-releases</a></div><div class="notion-text notion-block-996fd4ed2e734b36b0f3f87f1f462f2e">把本月所有重要模型发布整理得非常清晰，配有 benchmark 对比表。用 20 分钟了解整个 3 月的模型格局变化，性价比极高。</div><div class="notion-text notion-block-9c1e23c9551e48b4a3cdb5141764f334"><b>第 3 位：LangGraph 官方教程 — 多 agent 工作流教程</b></div><div class="notion-text notion-block-0a896e37a446485d846431880543e147">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://langchain-ai.github.io/langgraph/tutorials/">https://langchain-ai.github.io/langgraph/tutorials/</a></div><div class="notion-text notion-block-33097e272e564b908874aad309b12c8b">agent 框架已经是工程标配，LangGraph 
是最值得投入时间的框架。官方教程文档质量高，直接从「攀岩视频分析 agent」出发设计学习路径，边学边建项目。</div><hr class="notion-hr notion-block-ef7bdf1c5ec840f198db54f4d6605281"/><h2 class="notion-h notion-h1 notion-h-indent-0 notion-block-d90801b8671c41419f7b830eaeb5a691" data-id="d90801b8671c41419f7b830eaeb5a691"><span><div id="d90801b8671c41419f7b830eaeb5a691" class="notion-header-anchor"></div><a class="notion-hash-link" href="#d90801b8671c41419f7b830eaeb5a691" title="五、今日行动清单"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">五、今日行动清单</span></span></h2><div class="notion-text notion-block-9354cbdf2e0d46d983abac318ad33a1a"><b>1. 
今天值得收藏但不必立刻看的</b></div><ul class="notion-list notion-list-disc notion-block-9e32f2dfdfdd43b9bf64c25ae8910a9c"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://mistral.ai/news/mistral-3">Mistral Small 4 技术报告</a> — 等项目用到 multimodal 时重读</li></ul><ul class="notion-list notion-list-disc notion-block-f18b5972df4c4e50bfe8a38fda832d03"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blogs.nvidia.com/blog/gtc-2026-news/">NVIDIA Dynamo 架构文档</a> — inference 方向深入时再看</li></ul><ul class="notion-list notion-list-disc notion-block-7b9fa8cd46474499aef840476b26d993"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.mean.ceo/ai-startup-funding-news-march-2026/">AMI Labs / JEPA 架构介绍</a> — 技术细节尚未公开，先收藏跟踪</li></ul><ul class="notion-list notion-list-disc notion-block-3dc47d0b3fea4fe2a247aabec838972e"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://ragflow.io/blog/rag-review-2025-from-rag-to-context">RAGFlow Blog</a> — 下次做 RAG 时优先读</li></ul><div class="notion-text notion-block-18e53c4b896742798baba28b32c07695"><b>2. 今天值得精读的</b></div><ul class="notion-list notion-list-disc notion-block-b252221e0b26477bb034bb892b3dd4ea"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://Emergent.sh">Emergent.sh</a><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://emergent.sh/learn/claude-code-vs-cursor"> Claude Code vs Cursor</a> — 立即影响开发效率，今天读完</li></ul><ul class="notion-list notion-list-disc notion-block-29c03b6a6acc48a89648b99be29fc419"><li><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.buildfastwithai.com/blogs/ai-models-march-2026-releases">BuildFastWithAI 3 月模型综述</a> — 补完本月模型认知缺口</li></ul><div class="notion-text notion-block-ac1bc964bc004d729929d31e902f7b3d"><b>3. 
今天值得复现/试用的</b></div><ul class="notion-list notion-list-disc notion-block-50371087be364d57932a016e743a2f37"><li><b>deer-flow</b>：跑通字节的长程 agent demo，理解多步规划架构</li></ul><ul class="notion-list notion-list-disc notion-block-74f45cd64e54404fbcb7f6edbd969e01"><li><b>DeepEval</b>：<code class="notion-inline-code">pip install deepeval</code>，用一个简单 demo 体验 eval pipeline</li></ul><ul class="notion-list notion-list-disc notion-block-a2c3c06d7b2e4287889da7912080ee28"><li><b>Qwen 3.5 视频理解</b>：通过 HuggingFace 或 API 测试一段攀岩视频，看描述质量</li></ul><div class="notion-text notion-block-7c2572be49594b36953ec47e80f2c1ff"><b>4. 今天值得记到项目 roadmap 的</b></div><ul class="notion-list notion-list-disc notion-block-d4f0ffd313a24072a1e06a503bea8e19"><li>攀岩 app 技术选型：<b>Qwen 3.5 9B</b>（视频理解）+ <b>LangGraph</b>（agent pipeline）+ <b>DeepEval</b>（质量评估）</li></ul><ul class="notion-list notion-list-disc notion-block-3a6d15e7360c40ff93581faf4ed67379"><li>考虑增加「合成数据」模块：用 Kling 3.0 API 生成标准动作示范视频，解决标注数据稀缺问题</li></ul><ul class="notion-list notion-list-disc notion-block-06bd90c69c3946f7acfcb6447ac5446b"><li>长期研究方向：M-PCT + DGST 的 3D 姿态估计方法，适用于攀岩复杂遮挡场景</li></ul><ul class="notion-list notion-list-disc notion-block-ce70772324b54042892eff2a5eeacdf0"><li>iOS 版本规划：待 Core AI Framework 正式发布后，考虑 on-device 推断方案</li></ul><div class="notion-text notion-block-42b9d0eb6b6a4411a38cc86e3fc887f3"><b>5. 
今天面试里可以拿来讲的 1~2 个点</b></div><div class="notion-text notion-block-14348d1fd328459ab27a7a399c7c4a2b">① <b>推断优化</b>：“随着 NVIDIA 宣布推断时代到来，我正在学习 speculative decoding 和 continuous batching。Dynamo 开源让我可以直接研究推断 OS 的架构设计，这将是 LLM 工程的下一个核心技能。”</div><div class="notion-text notion-block-551b5124481642dcb236f4526bf927aa">② <b>AI 应用的 eval 设计</b>：“我在攀岩动作分析 app 中使用 DeepEval 建立了 LLM 输出质量评估 pipeline，用 G-Eval 指标衡量动作描述的准确性和可操作性，这是从 demo 到产品的关键一步。”</div><hr class="notion-hr notion-block-2a14a46b6ec04b8393ebbbcdf96933da"/><div class="notion-text notion-block-b40240c2744241978e3181bd66b32f4a"><em>🤖 AI 日报由 Claude 自动生成 | 数据截至 2026-03-26 | 如有遗漏或错误欢迎反馈</em></div></main></div>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[AI 日报 | 2026-03-25]]></title>
            <link>https://dundun0504.com/article/32e670e5-5499-8126-8233-fc57f48b3387</link>
            <guid>https://dundun0504.com/article/32e670e5-5499-8126-8233-fc57f48b3387</guid>
            <pubDate>Wed, 25 Mar 2026 00:00:00 GMT</pubDate>
            <description><![CDATA[2026-03-25 AI 日报：Windsurf Wave 13 发布 SWE-1.5（78% SWE-bench）+ 并行 Agent；GPT-5.4 原生 computer use（OSWorld 75%）进入 Codex；BitNet.cpp HN 持续热议；攀岩 hold 检测新数据集 + Belay AI 产品发布；MoTok 运动生成新思路。]]></description>
<content:encoded><![CDATA[<div id="notion-article" class="mx-auto overflow-hidden "><main class="notion light-mode notion-page notion-block-32e670e5549981268233fc57f48b3387"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><blockquote class="notion-quote notion-block-3b83db3d57034528ba7179004bc4f8bc"><div>📋 <b>今日亮点</b>：Windsurf Wave 13 是本周编码工具最大升级，直接可用；GPT-5.4 computer use 改变 agentic 范式；攀岩方向出现 hold 检测新数据集与商业产品 Belay AI；BitNet.cpp 仍是 edge 部署最值得关注的进展。优先看第一部分的第 1、2、3 条和 C 部分。</div></blockquote><hr class="notion-hr notion-block-c54e8043b3a8493d9b5aef0ca972f590"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-ff171153d9104825b1e368d67d429f45" data-id="ff171153d9104825b1e368d67d429f45"><span><div id="ff171153d9104825b1e368d67d429f45" class="notion-header-anchor"></div><a class="notion-hash-link" href="#ff171153d9104825b1e368d67d429f45" title="一、今日最重要的 5 条"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">一、今日最重要的 5 条</span></span></h3><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-5e27b07dde5a484ea6cb282d17dc3fbf" data-id="5e27b07dde5a484ea6cb282d17dc3fbf"><span><div id="5e27b07dde5a484ea6cb282d17dc3fbf" class="notion-header-anchor"></div><a class="notion-hash-link" href="#5e27b07dde5a484ea6cb282d17dc3fbf" title="1. 
🔥 Windsurf Wave 13：免费 SWE-1.5 + 并行 Agent + Git Worktrees"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">1. 🔥 Windsurf Wave 13：免费 SWE-1.5 + 并行 Agent + Git Worktrees</span></span></h4><div class="notion-text notion-block-260fca402eaf41fdb93a8e3129881f3b"><b>发生了什么：</b> Windsurf 发布 Wave 13，核心是三件事：① SWE-1.5 模型取代 SWE-1 成为默认，78% SWE-bench Verified（超越大多数付费 tier 模型），对所有用户免费开放至月底；② 原生并行 Agent 支持——5 个 agent 同时跑 5 个 bug，通过 Git Worktree 隔离不同分支，侧边栏多窗格监控；③ Cascade Hooks 支持在 agent 工作流关键节点执行自定义命令；Arena Mode 可并排对比两个模型。</div><div class="notion-text notion-block-dbf35002fc1b498f9dbe5f40429bd314"><b>为什么重要：</b> 并行 Agent + Git Worktree 是 agentic coding 的系统性升级，不再是「chat 帮你写代码」，而是「多个 agent 同时推进项目不同部分」。这是 2026 AI IDE 大战的关键转折点。</div><div class="notion-text notion-block-e71589eb477d45ef99208eaa6e4ac3b5"><b>对你的关系：</b> 开发攀岩 app 时可用 Windsurf Wave 13 同时跑「pose estimation 模块」「feedback 生成模块」「前端 UI」三个 agent 并行开发，效率翻倍。SWE-1.5 免费且性能强，值得今天就切换过去。</div><div class="notion-text notion-block-a12b83cb2050467487f2819eb0abf385">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://windsurf.com/changelog">Windsurf Wave 13 官方 changelog</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://byteiota.com/windsurf-wave-13-free-swe-1-5-parallel-agents-escalate-ai-ide-war/">测试报告</a></div><hr class="notion-hr notion-block-db42e9ae00844d1c89467801905504f7"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-652f439c19c3483db666bd6323ec8fe7" data-id="652f439c19c3483db666bd6323ec8fe7"><span><div id="652f439c19c3483db666bd6323ec8fe7" 
class="notion-header-anchor"></div><a class="notion-hash-link" href="#652f439c19c3483db666bd6323ec8fe7" title="2. 🔥 GPT-5.4 原生 Computer Use + OSWorld-V 75% 进入 Codex"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">2. 🔥 GPT-5.4 原生 Computer Use + OSWorld-V 75% 进入 Codex</span></span></h4><div class="notion-text notion-block-e32e696848174c81b0c62ccb823ef9b2"><b>发生了什么：</b> OpenAI GPT-5.4 正式进入 Codex，带来原生 computer use 能力。关键数据：OSWorld-V 75%（人类基线 72.4%，首次超越人类）；1M token context；Token 使用效率比 GPT-5.2 节省 33%；GPT-5.4 mini 同步上线作为 Codex 轻量子模型（速度 2x，消耗 30% 额度）。</div><div class="notion-text notion-block-e22a23ebb65448feb69187e8f2625942"><b>为什么重要：</b> &quot;控制桌面&quot; 不再是 demo，是真实 GA 能力，且性能超人类。这意味着 coding agent 从「生成代码片段」进化到「自主操作 IDE、浏览器、终端执行完整任务」，agentic 工程范式迎来质变。</div><div class="notion-text notion-block-654c2954adb3496e904b7649fa5b479e"><b>对你的关系：</b> Codex 的并行沙箱执行 + 自动创建 PR，是目前云端 coding agent 的最强选项之一（无需本地部署）。注意与 Windsurf 对比选择。</div><div class="notion-text notion-block-f7b4bb1934cb4eddb3cdd2ea970b9365">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://openai.com/index/introducing-gpt-5-4/">OpenAI GPT-5.4 发布页</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://developers.openai.com/codex/changelog">Codex changelog</a></div><hr class="notion-hr notion-block-37d4b9ff303e4ff7966bc232784583df"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-dea1e146a14d4ab2830efe9aa27fec61" data-id="dea1e146a14d4ab2830efe9aa27fec61"><span><div id="dea1e146a14d4ab2830efe9aa27fec61" 
class="notion-header-anchor"></div><a class="notion-hash-link" href="#dea1e146a14d4ab2830efe9aa27fec61" title="3. 🔥 BitNet.cpp：单 CPU 跑 100B 模型，HN 讨论仍在持续"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">3. 🔥 BitNet.cpp：单 CPU 跑 100B 模型，HN 讨论仍在持续</span></span></h4><div class="notion-text notion-block-409f6243d9cf4fe89812c140e5fe57b2"><b>发生了什么：</b> Microsoft 官方 1-bit LLM 推理框架持续成为 GitHub trending 和 HN 讨论热点。技术指标：ARM CPU 加速 1.37x-5.07x，能耗降低 55-82%；x86 CPU 加速 2.37x-6.17x，能耗降低 71-82%；单 CPU 可运行 100B 参数模型（5-7 tokens/sec，接近阅读速度）。基于 llama.cpp，MIT 协议开源。</div><div class="notion-text notion-block-dc6a508079144775a49e5f2694ef46b0"><b>为什么重要：</b> GPU-free edge AI 从理论走向可用。社区真正关心的问题从「能不能跑」变成了「1-bit 在哪些任务已经够用」。这是 mobile 部署路径的核心技术节点。</div><div class="notion-text notion-block-078de57c27744c51aa74c3bd0f117919"><b>对你的关系：</b> 攀岩 app 长期 mobile 部署路径（手机端实时分析无需服务器），BitNet.cpp 是最直接的技术参考。</div><div class="notion-text notion-block-4a8e6721b8204ba19e8746e63babb81f">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/microsoft/BitNet">microsoft/BitNet</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://news.ycombinator.com/item?id=47334694">HN 讨论</a></div><hr class="notion-hr notion-block-e2685518755c4188af5ec5bcff71a19f"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-4ecbc1729b6b4ca5b44ec5d02de2fbeb" data-id="4ecbc1729b6b4ca5b44ec5d02de2fbeb"><span><div id="4ecbc1729b6b4ca5b44ec5d02de2fbeb" class="notion-header-anchor"></div><a class="notion-hash-link" href="#4ecbc1729b6b4ca5b44ec5d02de2fbeb" 
title="4. AlphaEvolve 向国家实验室扩展 + OpenEvolve 开源复现上线"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">4. AlphaEvolve 向国家实验室扩展 + OpenEvolve 开源复现上线</span></span></h4><div class="notion-text notion-block-3933f55c8de44f7ab011111fc25ad678"><b>发生了什么：</b> Google DeepMind AlphaEvolve（Gemini 驱动的进化算法 coding agent）在 3 月向美国能源部国家实验室扩展访问权限。同时，社区出现 OpenEvolve 开源实现（Hugging Face 博客），可复现 AlphaEvolve 的核心框架。AlphaEvolve 已在 Google 内部运行 &gt;1 年；节省 0.7% 全球算力；Gemini kernel 加速 23%；数学上发现新结构。</div><div class="notion-text notion-block-7695a201a6fc4c48b2227e3d313f5671"><b>为什么重要：</b> OpenEvolve 的出现意味着「LLM + 进化算法做 algorithm discovery」首次对外可复现，不再是 Google 内部黑盒。</div><div class="notion-text notion-block-6f06a59e521f426bbddc4eed33cff2ea"><b>对你的关系：</b> 了解原理即可；OpenEvolve 可作为「用 LLM 自动优化代码/算法」的研究工具参考。</div><div class="notion-text notion-block-80f64bc33f8b4ad09dfd66966cb521b8">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://deepmind.google/blog/alphaevolve-a-gemini-powered-coding-agent-for-designing-advanced-algorithms/">AlphaEvolve DeepMind Blog</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/blog/codelion/openevolve">OpenEvolve on HuggingFace</a></div><hr class="notion-hr notion-block-0e3964f569c04acba4851e0f1acc2be5"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-898aea8ad0b247cd8685ba0ce1ee91c6" data-id="898aea8ad0b247cd8685ba0ce1ee91c6"><span><div id="898aea8ad0b247cd8685ba0ce1ee91c6" class="notion-header-anchor"></div><a class="notion-hash-link" 
href="#898aea8ad0b247cd8685ba0ce1ee91c6" title="5. 「The Way Up」攀岩 Hold 检测数据集发布（arXiv:2505.12854）"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">5. 「The Way Up」攀岩 Hold 检测数据集发布（arXiv:2505.12854）</span></span></h4><div class="notion-text notion-block-1ba2e83792a846b7a42444296d92cd5d"><b>发生了什么：</b> 新论文「The Way Up: A Dataset for Hold Usage Detection in Sport Climbing」发布攀岩 hold 使用检测专用数据集，基于 2D keypoint pose estimation 检测关节与岩点的重叠关系，分析 hold 使用顺序和效率，面向真实攀岩场景。</div><div class="notion-text notion-block-8ffd169743204408823d05ef84ca70c5"><b>为什么重要：</b> 攀岩 AI 领域数据集极度稀缺，任何新公开数据集都是重要资源。「Hold 使用检测」是攀岩动作分析的核心子任务（判断选手用了哪个 hold、次序、效率）。</div><div class="notion-text notion-block-5fed8e640f1a48ff8dbbcda64d521713"><b>对你的关系：</b> 直接填补你 app 的核心功能缺口（判断选手用了哪个 hold、次序、效率）。今天就点开看数据集是否可获取。</div><div class="notion-text notion-block-d94a79b696414d7aa7d7ac2e22f8b336">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2505.12854">arXiv:2505.12854</a></div><hr class="notion-hr notion-block-4ba8cca851d64cddaebfd3b582d7279a"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-c375cae327e54b0bab779d91252c18b7" data-id="c375cae327e54b0bab779d91252c18b7"><span><div id="c375cae327e54b0bab779d91252c18b7" class="notion-header-anchor"></div><a class="notion-hash-link" href="#c375cae327e54b0bab779d91252c18b7" title="二、按目标分类"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 
3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">二、按目标分类</span></span></h3><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-a90d88dc238748ba89a9af1f1e57235e" data-id="a90d88dc238748ba89a9af1f1e57235e"><span><div id="a90d88dc238748ba89a9af1f1e57235e" class="notion-header-anchor"></div><a class="notion-hash-link" href="#a90d88dc238748ba89a9af1f1e57235e" title="A. 前沿模型 / 一手发布"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">A. 
前沿模型 / 一手发布</span></span></h4><div class="notion-text notion-block-0d15a860524d48abab5bc538a0d91d0a"><b>【GPT-5.4：原生 Computer Use，首个超人类桌面操作模型】</b></div><ul class="notion-list notion-list-disc notion-block-e24ef43a6fd9496a81eb42a178c00fd4"><li>事件：OpenAI GPT-5.4 发布并进入 Codex，具备原生 computer use</li></ul><ul class="notion-list notion-list-disc notion-block-8096d6ab05b640e9b3a64bb096dcc426"><li>核心内容：OSWorld-V 75%（人类 72.4%）；1M context；token 效率提升 33%；desktop productivity 任务超越人类</li></ul><ul class="notion-list notion-list-disc notion-block-846b6f9a87f84422aa91fb125d6ab367"><li>为什么重要：从「描述操作」到「真正执行操作」是质变；agentic 工程设计范式改变</li></ul><ul class="notion-list notion-list-disc notion-block-9309676512fd448d9e3cac5d452cd637"><li>我需不需要点开：需要——了解 computer use API，对 agent 工程有直接参考价值</li></ul><ul class="notion-list notion-list-disc notion-block-c0514f2caa1b4a7889f7213100a1e94a"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://openai.com/index/introducing-gpt-5-4/">https://openai.com/index/introducing-gpt-5-4/</a></li></ul><div class="notion-text notion-block-bb15438d89954860b8be9626a48bca4e"><b>【Gemini 3.1 Flash-Lite：$0.25/M tokens，高频调用首选】</b></div><ul class="notion-list notion-list-disc notion-block-07892ab611bc4fefbec8ac5b6a1d663c"><li>事件：Google 发布效率导向新品 Gemini 3.1 Flash-Lite，定价极具竞争力</li></ul><ul class="notion-list notion-list-disc notion-block-2c5fe4a1ef9b4376a1199e081f06d0e6"><li>核心内容：比前代快 2.5x，输出速度快 45%，input 仅 $0.25/M tokens</li></ul><ul class="notion-list notion-list-disc notion-block-dc27e12c23704e83835e9fc4d1c36648"><li>为什么重要：对高频调用 app（视频帧批量分析）极具性价比</li></ul><ul class="notion-list notion-list-disc notion-block-af6acfccc4eb4f5fbeb2d02dc646e059"><li>我需不需要点开：需要关注定价——攀岩 app 视频帧批量分析的直接成本优化选项</li></ul><ul class="notion-list notion-list-disc notion-block-6fe80aee28ef4843ac5bd99e86b6a487"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://llm-stats.com/llm-updates">https://llm-stats.com/llm-updates</a></li></ul><div 
class="notion-text notion-block-feb3961fec8b4c82a5d219d937ef17dd"><b>【Claude Sonnet 4.6 Memory 全量上线（3 月）】</b></div><ul class="notion-list notion-list-disc notion-block-4ad19258bba04ab9a97fbec3171afa79"><li>事件：Anthropic 将 Claude 跨对话 memory 功能推送给所有用户</li></ul><ul class="notion-list notion-list-disc notion-block-4248aa0692164194b09a7107097c6d0a"><li>核心内容：记住用户偏好、项目上下文、工作风格，跨对话持久化</li></ul><ul class="notion-list notion-list-disc notion-block-7ab4b7f7a4894bbe92a724614d77fd0c"><li>为什么重要：开发工具 workflow 改变，Claude 不再需要每次重新介绍项目背景</li></ul><ul class="notion-list notion-list-disc notion-block-9d863877ba574a6eb90506c39221f4ac"><li>我需不需要点开：了解即可；可立刻试用（你已有 Claude 账号）</li></ul><ul class="notion-list notion-list-disc notion-block-817965d6d8a441dbb9cdbc2bf7c3a8b2"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://renovateqr.com/blog/ai-model-releases-2026">https://renovateqr.com/blog/ai-model-releases-2026</a></li></ul><hr class="notion-hr notion-block-f340569e83a34ad9936e33b259c665be"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-1d0b2ef8dea544e28bf2e4f09df28851" data-id="1d0b2ef8dea544e28bf2e4f09df28851"><span><div id="1d0b2ef8dea544e28bf2e4f09df28851" class="notion-header-anchor"></div><a class="notion-hash-link" href="#1d0b2ef8dea544e28bf2e4f09df28851" title="B. AI 工程 / Agent / Coding Workflow"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">B. 
AI 工程 / Agent / Coding Workflow</span></span></h4><div class="notion-text notion-block-0ae48b26903a49c087ccc05258bc2c2c"><b>【Windsurf Wave 13：并行 Agent + SWE-1.5 免费】</b></div><ul class="notion-list notion-list-disc notion-block-57ee5e63b11c422b8d3e6a97ddc21bc6"><li>内容：SWE-1.5 模型（78% SWE-bench），并行 multi-agent，Git Worktree 隔离，Cascade Hooks，Arena Mode 模型对比</li></ul><ul class="notion-list notion-list-disc notion-block-b6bf8d80ef7f44c6b37d5af8323e0a27"><li>可落地价值：同时运行多个 coding agent 处理不同模块，开发效率翻倍；Git Worktree 避免分支冲突</li></ul><ul class="notion-list notion-list-disc notion-block-09363fdb9b284170880c56dcb6bbc29e"><li>对我当前开发/学习的意义：今天就可切换到 Wave 13；并行 agent 直接加速攀岩 app 多模块开发</li></ul><ul class="notion-list notion-list-disc notion-block-a50a3ab0e645487c9bf25ded5ca72e96"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://windsurf.com/changelog">https://windsurf.com/changelog</a></li></ul><div class="notion-text notion-block-8e51c4e8e2074d8aa3e03d3bbaeea4f5"><b>【OpenAI Codex：云端 Agent，并行沙箱 + 自动 PR】</b></div><ul class="notion-list notion-list-disc notion-block-3052d4aaaf1840be97b095e5a29f8129"><li>内容：Codex 现在搭载 GPT-5.4，支持并行沙箱执行、深度 GitHub 集成、自动创建 PR</li></ul><ul class="notion-list notion-list-disc notion-block-f203097c884f4fb5b106b022a826b2fc"><li>可落地价值：GitHub issue → agent 自动实现 + 创建 PR，无需本地环境；适合快速迭代</li></ul><ul class="notion-list notion-list-disc notion-block-bfc77236628c457490cc7c303ab1f45c"><li>对我当前开发/学习的意义：与 Windsurf 配合使用——Windsurf 本地，Codex 云端后台跑长任务</li></ul><ul class="notion-list notion-list-disc notion-block-d98c701a018243ba88b204b1f38e6f0f"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://developers.openai.com/codex/changelog">https://developers.openai.com/codex/changelog</a></li></ul><div class="notion-text notion-block-a967ef0b898840d382a1684b845be64e"><b>【AI 开发工具 2026 市场格局：CLI / IDE-Native / Autonomous 三分】</b></div><ul class="notion-list notion-list-disc 
notion-block-6737e3ae724548929bc2bb384755d4ad"><li>内容：Windsurf #1（Wave 13），Antigravity #2（革命性免费定价），Codex 重返前五，Cursor 持续竞争；三类工具（CLI agent / IDE 内嵌 / 云端自主 agent）各有最优场景</li></ul><ul class="notion-list notion-list-disc notion-block-077a788b57b54268af11beaca713bf39"><li>可落地价值：帮你选对工具而不是盲目跟风；CLI 最灵活，IDE 最顺手，云端最省心</li></ul><ul class="notion-list notion-list-disc notion-block-4a1ee3160e8e41bbabbd5d57c7309c70"><li>对我当前开发/学习的意义：Windsurf（IDE）+ Codex（云端自主）是当前最优组合</li></ul><ul class="notion-list notion-list-disc notion-block-918690e6eb4a4e7c90136f3d1a7205e7"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.logrocket.com/ai-dev-tool-power-rankings/">https://blog.logrocket.com/ai-dev-tool-power-rankings/</a></li></ul><div class="notion-text notion-block-cfb078a55aa640b08fda6230e0545411"><b>【Addy Osmani：My LLM Coding Workflow Going Into 2026】</b></div><ul class="notion-list notion-list-disc notion-block-0f3d78c3af844a5b84a494a9d2c3f5df"><li>内容：Google Chrome 工程师分享真实 LLM coding workflow（不是玄学，是实际操作流程）</li></ul><ul class="notion-list notion-list-disc notion-block-5ae9d2971f29431182f3741957d1c0a4"><li>可落地价值：高信噪比的高质量分享，帮你标准化自己的 AI 辅助开发 workflow</li></ul><ul class="notion-list notion-list-disc notion-block-5cdbeb10cc564439adc03f765f60b1ae"><li>对我当前开发/学习的意义：直接参考，优化你和 coding agent 协作的实际流程</li></ul><ul class="notion-list notion-list-disc notion-block-ac68f47891404501b84429c7c697e2b2"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://addyosmani.com/blog/ai-coding-workflow/">https://addyosmani.com/blog/ai-coding-workflow/</a></li></ul><hr class="notion-hr notion-block-4ae0910bd77b4f9db3a393ec61a1be01"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-00eb62083103477db5efd93fd3d98abc" data-id="00eb62083103477db5efd93fd3d98abc"><span><div id="00eb62083103477db5efd93fd3d98abc" class="notion-header-anchor"></div><a class="notion-hash-link" href="#00eb62083103477db5efd93fd3d98abc" title="C. 
视觉 / 视频 / 运动人体分析"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">C. 视觉 / 视频 / 运动人体分析</span></span></h4><div class="notion-text notion-block-2ae7cc86fc2d492da08a991384be84e8"><b>【⭐ 高优先级】「The Way Up」：攀岩 Hold 使用检测专用数据集（arXiv:2505.12854）</b></div><ul class="notion-list notion-list-disc notion-block-6111373aeba043ea85121908193c8dfd"><li>内容：针对攀岩 hold 使用检测的专用数据集，基于 2D keypoint pose estimation 检测关节与岩点重叠，分析 hold 使用顺序和效率</li></ul><ul class="notion-list notion-list-disc notion-block-27a13a38f3584509b8cac1eb50d92bf7"><li>与「攀岩动作分析 app」的相关性：极高。Hold-level 分析（用了哪个 hold、次序、效率）是动作改进建议的核心依据之一</li></ul><ul class="notion-list notion-list-disc notion-block-b8bc7e2e8ed5463eb844aea54a59a0cd"><li>可迁移到项目的点：① 直接使用数据集训练 hold 检测模型；② keypoint-overlap 检测方法可迁移到你的 pose pipeline；③ 了解 hold 使用序列如何作为动作质量的评估维度</li></ul><ul class="notion-list notion-list-disc notion-block-c71eb1d3e31d438bb8ab265d447966e4"><li>优先级：高——今天就点开，看数据集是否可获取</li></ul><ul class="notion-list notion-list-disc notion-block-87e04b902cae4f239530a217596c39c7"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2505.12854">https://arxiv.org/abs/2505.12854</a></li></ul><div class="notion-text notion-block-7ac9554fb699449abd271a7c91e7a3dc"><b>【⭐ 高优先级】Belay AI：商业攀岩 AI 产品（</b><b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://belay.ai">belay.ai</a></b><b>）已上线</b></div><ul class="notion-list notion-list-disc notion-block-95843ee0befa48dd8e1d600e65acfc30"><li>内容：专注攀岩的 AI 分析商业产品，功能包括：体部关键点速度/方向估计、重心追踪、动态动作分析；面向真实攀岩者</li></ul><ul class="notion-list 
notion-list-disc notion-block-dc9bd6ea7e0e4f89b81276e079e66515"><li>与「攀岩动作分析 app」的相关性：直接竞品 + 参考标杆。了解其功能边界和体验短板有助于你找差异化</li></ul><ul class="notion-list notion-list-disc notion-block-8dd57acd8bf14902b891411baf0b473f"><li>可迁移到项目的点：① 重心追踪（center of gravity）是你 app 可加入的高价值功能；② 商业产品 UI/UX 参考；③ 分析其技术栈（待研究）</li></ul><ul class="notion-list notion-list-disc notion-block-5475a71906c442dbb044582a9a816307"><li>优先级：高——今天去 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://belay.ai">belay.ai</a> 注册体验，观察功能和体验短板（30 分钟）</li></ul><ul class="notion-list notion-list-disc notion-block-2d34f200517d448b8882bbfc5e022067"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://belay.ai">https://belay.ai</a></li></ul><div class="notion-text notion-block-68ea64ea13564598ae67799cc128a42c"><b>【中优先级】MotionLLM：视频 + 动作序列联合理解（arXiv:2405.20340）</b></div><ul class="notion-list notion-list-disc notion-block-5e8b6fef7bb543c9a45895fbc30d343b"><li>内容：统一视频帧 + 运动序列（SMPL motion）作为 LLM 输入，联合训练 video-text 和 motion-text，实现人体行为理解</li></ul><ul class="notion-list notion-list-disc notion-block-57fc57097d034f8d9f2d0be4cb9f8fb2"><li>与「攀岩动作分析 app」的相关性：中高。「上传视频 → 理解动作 → 生成建议」的架构可参考 MotionLLM 的 joint modeling 设计</li></ul><ul class="notion-list notion-list-disc notion-block-d961fc8f1b664e2eb3bde3d80d6ec8a7"><li>可迁移到项目的点：视频和骨架动作序列的联合编码方式；不需要 clean mocap 数据也能训练</li></ul><ul class="notion-list notion-list-disc notion-block-c84b91094d1d4a52be5bc6f8c1050d16"><li>优先级：中——收藏，精读列入下周计划</li></ul><ul class="notion-list notion-list-disc notion-block-51c23408c31f4970adae4cd83065d917"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2405.20340">https://arxiv.org/abs/2405.20340</a></li></ul><div class="notion-text notion-block-1e7c3799381b4e02b352073a2c179f9b"><b>【中优先级】Climbing Technique Evaluation via Skeleton Video Stream（MDPI Sensors）</b></div><ul class="notion-list notion-list-disc 
notion-block-a0c62b8ecf9b48aaac07624b5b3f208a"><li>内容：基于骨架视频流的攀岩技术评估，使用 keypoint 序列分析攀岩技术</li></ul><ul class="notion-list notion-list-disc notion-block-a6fb20a1061247ada7f6a7a99e6de6b7"><li>与「攀岩动作分析 app」的相关性：中。骨架序列分析方法论参考</li></ul><ul class="notion-list notion-list-disc notion-block-4c2a3f23113a4000bc4fcd80915808bb"><li>可迁移到项目的点：骨架序列特征提取→技术评估的完整 pipeline 思路</li></ul><ul class="notion-list notion-list-disc notion-block-8343d7df53514413b26c7f688d22d5f5"><li>优先级：中——收藏备查</li></ul><ul class="notion-list notion-list-disc notion-block-546f716faa2d4cc1b04bea52d2d1e0fa"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.mdpi.com/1424-8220/23/19/8216">https://www.mdpi.com/1424-8220/23/19/8216</a></li></ul><hr class="notion-hr notion-block-07669102d8af4ad59eecaabbc8198e37"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-f7649246f8c3482e88be56697e5f8713" data-id="f7649246f8c3482e88be56697e5f8713"><span><div id="f7649246f8c3482e88be56697e5f8713" class="notion-header-anchor"></div><a class="notion-hash-link" href="#f7649246f8c3482e88be56697e5f8713" title="D. 产品化 / 商业化 / 行业动态"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">D. 
产品化 / 商业化 / 行业动态</span></span></h4><div class="notion-text notion-block-ca5122d0b0b44b7d9fc0734fecc0a400"><b>【2026 AI 从 Hype 到 Pragmatism：垂直场景 + 真实落地 &gt; 参数竞赛】</b></div><ul class="notion-list notion-list-disc notion-block-33822ea5a9b64a13a863134e5b50c82a"><li>动态：TechCrunch、MIT Tech Review 等多家权威媒体判断 2026 年是 AI 转向实用主义的转折年</li></ul><ul class="notion-list notion-list-disc notion-block-56ddbccad6254a7b8925d911ff5ddcea"><li>背后的趋势判断：大模型能力差距缩小（GPT-5.4 ≈ Claude 4.6 ≈ Gemini 3.1 在大多数任务上），应用层是真正的竞争场；企业从 pilot 进入 production</li></ul><ul class="notion-list notion-list-disc notion-block-ba37abd631594fc8b70c58320c302b80"><li>对 side project / 求职 / 项目方向的启发：做垂直场景 AI 应用（如攀岩 app）比通用工具更有差异化；「能落地、有真实用户」比「用了最新技术」更打动面试官</li></ul><ul class="notion-list notion-list-disc notion-block-92f8494a96de443488664ac68fe262e7"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://techcrunch.com/2026/01/02/in-2026-ai-will-move-from-hype-to-pragmatism/">https://techcrunch.com/2026/01/02/in-2026-ai-will-move-from-hype-to-pragmatism/</a></li></ul><div class="notion-text notion-block-99099ffb07854b10a9e1fdc9a8681391"><b>【Agentic AI 进入生产阶段：Autonomous Workflow 成为 2026 主旋律】</b></div><ul class="notion-list notion-list-disc notion-block-1e8670dbf8884bd0b2b600558cb456d4"><li>动态：多个分析报告确认 agentic workflow 从 demo 进入日常开发实践；长任务自主执行（hours, not seconds）成为新范式</li></ul><ul class="notion-list notion-list-disc notion-block-e9b78d1d9fba479cb3d6c4b4e9969f17"><li>背后的趋势判断：AI 工程师的价值从「写 prompt」变成「设计 agent workflow 和 evaluation」</li></ul><ul class="notion-list notion-list-disc notion-block-e87eeca3f8774dc58936ccac445b8ff4"><li>对 side project / 求职 / 项目方向的启发：项目中展示「我设计并评估了 agent workflow」比「我用了 GPT-4」更有说服力</li></ul><ul class="notion-list notion-list-disc notion-block-22c17057288241a7bb5b5f67cf29c0d1"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" 
href="https://dev.to/ajay_kumar_1daef5fe089885/ai-developer-tools-enter-autonomous-era-agentic-systems-rise-in-march-2026-1f38">https://dev.to/ajay_kumar_1daef5fe089885/ai-developer-tools-enter-autonomous-era-agentic-systems-rise-in-march-2026-1f38</a></li></ul><div class="notion-text notion-block-ec78813227d74ac79c0b607ff7780bf0"><b>【Sports AI Vertical：攀岩 AI 从研究走向商业产品】</b></div><ul class="notion-list notion-list-disc notion-block-6ea0e41219c04c37bc4ad172883a4096"><li>动态：Belay AI 上线，提供面向攀岩者的 AI 动作分析商业产品；学术界同期出现「The Way Up」数据集和多篇相关论文</li></ul><ul class="notion-list notion-list-disc notion-block-bdd44d2b9dd24aabb8d713a4c43289c1"><li>背后的趋势判断：垂直运动 AI 场景开始商业化，但市场仍处早期，技术壁垒在数据集 + domain knowledge</li></ul><ul class="notion-list notion-list-disc notion-block-13d03bf8287b4204982e425179ab6335"><li>对 side project / 求职 / 项目方向的启发：你的时机很好——市场有需求但竞争未充分；差异化在数据集深度 + 评估方法的专业性；<a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://belay.ai">belay.ai</a> 是直接竞品，值得深度研究其功能缺口</li></ul><ul class="notion-list notion-list-disc notion-block-94d7b4eada2e4834ba4af3542d77523f"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://belay.ai">https://belay.ai</a></li></ul><hr class="notion-hr notion-block-d9f479be00b340379adf741798298fbe"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-fe5f5f70f0db402dbfd03bc0a030f0f3" data-id="fe5f5f70f0db402dbfd03bc0a030f0f3"><span><div id="fe5f5f70f0db402dbfd03bc0a030f0f3" class="notion-header-anchor"></div><a class="notion-hash-link" href="#fe5f5f70f0db402dbfd03bc0a030f0f3" title="E. 
学习价值 / 求职价值"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">E. 学习价值 / 求职价值</span></span></h4><div class="notion-text notion-block-7ad13a293fa64f0d85a5b2e1b2aa0487"><b>【Windsurf Wave 13 并行 Agent 架构：面试里讲 agentic 工程的好素材】</b></div><ul class="notion-list notion-list-disc notion-block-04acf11e8eb54e75a9a05f0563da3bb4"><li>内容：Wave 13 的并行 agent + Git Worktree 设计，体现 multi-agent coordination 的工程实践</li></ul><ul class="notion-list notion-list-disc notion-block-b8b1b437d03c40e391763e6e88f49a45"><li>适合我怎么用：精读 changelog + 实际上手 + 面试表达。能讲「并行 agent 如何通过 Git Worktree 实现隔离、Cascade Hooks 如何做 workflow 控制点」，展示 agent 工程系统设计认知</li></ul><ul class="notion-list notion-list-disc notion-block-0491b1c6edb042029277b2008f32df98"><li>推荐动作：今天切换到 Wave 13，用它开发攀岩 app 并记录 workflow，写进项目经历</li></ul><ul class="notion-list notion-list-disc notion-block-5cf01f574bca4af3bd30237cf26e58f3"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://windsurf.com/changelog">https://windsurf.com/changelog</a></li></ul><div class="notion-text notion-block-29f48bc9296a4e78809dbf0dfeecb0de"><b>【「The Way Up」数据集 + Belay AI 竞品分析：面试差异化表达】</b></div><ul class="notion-list notion-list-disc notion-block-82dde5ac00ba4a51adc7de526c00dc41"><li>内容：攀岩 hold 检测数据集 + 商业竞品，构成你项目的「学术背书 + 市场调研」双维度</li></ul><ul class="notion-list notion-list-disc notion-block-b8f3235412c24ee2935acb2bb77588bc"><li>适合我怎么用：精读论文 + 体验竞品 + 面试表达。能说「我调研了 Belay AI 的功能边界，发现其在 hold-level feedback 颗粒度上的不足，并通过 The Way Up 数据集构建补充方案」，展示产品 + 技术双视角思考</li></ul><ul class="notion-list notion-list-disc 
notion-block-27b43185184b49d584a8c0dc9f7eb2b7"><li>推荐动作：今天注册 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://belay.ai">belay.ai</a> 体验，同时读「The Way Up」论文，撰写竞品分析笔记</li></ul><ul class="notion-list notion-list-disc notion-block-8d014be3f28d437d9a81c8319fcded85"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2505.12854">https://arxiv.org/abs/2505.12854</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://belay.ai">https://belay.ai</a></li></ul><div class="notion-text notion-block-c13ac500867a457c848790421f9b866f"><b>【Addy Osmani LLM Coding Workflow：实用 workflow 设计参考】</b></div><ul class="notion-list notion-list-disc notion-block-88ac20595ba64fccafd345b92febcce0"><li>内容：Google Chrome 工程师的真实 AI 辅助开发 workflow，信噪比高的高质量分享</li></ul><ul class="notion-list notion-list-disc notion-block-e9cf02ce0ae8402282591e9af6a7144d"><li>适合我怎么用：精读 + 纳入自己的日常工作流。面试时可提「我的 AI 辅助开发 workflow 参考了 Addy Osmani 的方法，做了哪些个人化调整」</li></ul><ul class="notion-list notion-list-disc notion-block-692df9fffff3457f83bece05c147ef15"><li>推荐动作：今天读完，挑 2-3 个实践直接落地到攀岩 app 开发中</li></ul><ul class="notion-list notion-list-disc notion-block-3da8d05549a9479480c278600b4329fe"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://addyosmani.com/blog/ai-coding-workflow/">https://addyosmani.com/blog/ai-coding-workflow/</a></li></ul><hr class="notion-hr notion-block-ab222eccc5fb49a39341ffc9c636a5f9"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-da0b974d90024bd9bb9bdb77431aa23e" data-id="da0b974d90024bd9bb9bdb77431aa23e"><span><div id="da0b974d90024bd9bb9bdb77431aa23e" class="notion-header-anchor"></div><a class="notion-hash-link" href="#da0b974d90024bd9bb9bdb77431aa23e" title="三、今日高分 GitHub Repo"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 
00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">三、今日高分 GitHub Repo</span></span></h3><div class="notion-text notion-block-86c47ee020144fcbaadd2a999eff6646"><b>Repo 1：microsoft/BitNet</b></div><ul class="notion-list notion-list-disc notion-block-6df54ebce1244accb8c5ad5ee7d98b52"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/microsoft/BitNet">https://github.com/microsoft/BitNet</a></li></ul><ul class="notion-list notion-list-disc notion-block-e63b48fe458242fabc915cfb1747f71c"><li>方向标签：infra / deployment / edge</li></ul><ul class="notion-list notion-list-disc notion-block-3683035a5b0a4a24956726a9804b3def"><li>这项目是干什么的：Microsoft 官方 1-bit LLM 推理框架，CPU 上高效运行 1-bit LLMs（BitNet b1.58），无需 GPU</li></ul><ul class="notion-list notion-list-disc notion-block-c3a099f5aaa143059c67dcbba68e8249"><li>为什么今天值得关注：HN 持续热议；x86 CPU 加速最高 6.17x；是 2026 最重要的 edge inference 框架之一</li></ul><ul class="notion-list notion-list-disc notion-block-2b43d9eef42b4e4a876dd7af69a162ce"><li>与我的相关性：攀岩 app 手机端部署路径；长期 mobile inference 核心技术选型</li></ul><ul class="notion-list notion-list-disc notion-block-3a0dd597262248978f6e594870c4b3a7"><li>上手成本：中（需了解 quantization 基础概念）</li></ul><ul class="notion-list notion-list-disc notion-block-f44d8bf001924248a8eb19a4c5ad1d17"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-72fd37205e7142ecbb386b0faf8d840b"><li>是否建议我复现：可先跑官方 demo 验证速度（低门槛，一个小时内完成）</li></ul><ul class="notion-list notion-list-disc notion-block-5d696fc4b07e4072a9292437d7b4b300"><li>一句话判断：edge AI 重要基础设施，中期 mobile 部署必参考</li></ul><div class="notion-text notion-block-9f039db0e32b49c3b2931812dd5d4fbd"><b>Repo 2：caramaschiHG/awesome-ai-agents-2026</b></div><ul class="notion-list 
notion-list-disc notion-block-84357175296240b88c42a24be1d0dc5d"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/caramaschiHG/awesome-ai-agents-2026">https://github.com/caramaschiHG/awesome-ai-agents-2026</a></li></ul><ul class="notion-list notion-list-disc notion-block-749f792c7d294df29b3a93c18ee93bcc"><li>方向标签：agent / curated / dev tools</li></ul><ul class="notion-list notion-list-disc notion-block-67e33c44f9a6472fb49990bebbb0464b"><li>这项目是干什么的：2026 年 AI agent 框架和工具综合列表，300+ 资源，20+ 类别，每月更新</li></ul><ul class="notion-list notion-list-disc notion-block-3e278b737afa4647aa222a6aca2cf3b5"><li>为什么今天值得关注：月度更新，比任何博客文章更及时；帮你快速 survey agent 生态而不做重复调研</li></ul><ul class="notion-list notion-list-disc notion-block-eb4c5a73adac40638e4757c853182272"><li>与我的相关性：快速找到攀岩 app agent pipeline 适合的 framework</li></ul><ul class="notion-list notion-list-disc notion-block-9fb7f036269743d88a3745f8e851996e"><li>上手成本：低（直接看 README）</li></ul><ul class="notion-list notion-list-disc notion-block-458221b3a7124d0c96145c691cd50962"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-5eb397a96d1b411a8cf0f9657e60bafd"><li>是否建议我复现：否</li></ul><ul class="notion-list notion-list-disc notion-block-23f7d3e4ed004273ad7a7731b4a77886"><li>一句话判断：agent 生态地图，收藏备查，每月花 10 分钟过一遍</li></ul><div class="notion-text notion-block-efc31bda45154d14899d2ea78799cccb"><b>Repo 3：OpenHands/OpenHands</b></div><ul class="notion-list notion-list-disc notion-block-fb86b912678543f18f2d9b45f13670ff"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/OpenHands/OpenHands">https://github.com/OpenHands/OpenHands</a></li></ul><ul class="notion-list notion-list-disc notion-block-64e7402f603e4a1aad8dc8e8ca216fc7"><li>方向标签：agent / coding / dev tools</li></ul><ul class="notion-list notion-list-disc notion-block-afc6fd82418c41c3bbc4ea622895f17a"><li>这项目是干什么的：开源 AI coding agent 平台，72% SWE-bench，Docker 
本地一键部署</li></ul><ul class="notion-list notion-list-disc notion-block-8cf62df7abbf494189f8ababa6934068"><li>为什么今天值得关注：69K stars；目前最成熟的开源 coding agent，可直接用于生产开发</li></ul><ul class="notion-list notion-list-disc notion-block-7947e72d92844a4ca7780236d6398f71"><li>与我的相关性：直接加速攀岩 app 开发；学习 agent workflow 架构设计</li></ul><ul class="notion-list notion-list-disc notion-block-6f11b74d605e4282b1098f7c3041cbd1"><li>上手成本：低（Docker 一键部署）</li></ul><ul class="notion-list notion-list-disc notion-block-db0eb3c38f3549c2a5d33224e9a39953"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-bdabc56bcfd44e65a44439226506b45f"><li>是否建议我复现：强烈建议——本周就部署</li></ul><ul class="notion-list notion-list-disc notion-block-ff335724937344c8b008041b4aa7653a"><li>一句话判断：不需要等，直接用，今天部署</li></ul><div class="notion-text notion-block-5a0e7180dbd84df0808da7bccda25bc3"><b>Repo 4：kyrolabs/awesome-agents</b></div><ul class="notion-list notion-list-disc notion-block-b85b23717d2b4221bbf82356d01d3409"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/kyrolabs/awesome-agents">https://github.com/kyrolabs/awesome-agents</a></li></ul><ul class="notion-list notion-list-disc notion-block-284b64e4c4824bb9987dd2cd9ee6423d"><li>方向标签：agent / curated</li></ul><ul class="notion-list notion-list-disc notion-block-92bcb43bc81042a791285eb40b6d959f"><li>这项目是干什么的：AI Agent 精选列表，持续维护，覆盖 framework、product、research</li></ul><ul class="notion-list notion-list-disc notion-block-5f177aa318c5436e8f57748330fea436"><li>为什么今天值得关注：补充 awesome-ai-agents-2026，两个列表侧重不同（这个更偏产品层）</li></ul><ul class="notion-list notion-list-disc notion-block-d6ac374f4f6643b69402c5a044fbd3d3"><li>与我的相关性：找攀岩 app 可参考的 agent 产品案例</li></ul><ul class="notion-list notion-list-disc notion-block-7f42878e7b9a4ef49029b8253baa4447"><li>上手成本：低</li></ul><ul class="notion-list notion-list-disc notion-block-c8ba5484dbe8408983269a914573ba88"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc 
notion-block-6e4de41f94104e55a786615c515e6fed"><li>是否建议我复现：否</li></ul><ul class="notion-list notion-list-disc notion-block-96c4d3a6bda04409b38a7f54f3a0dd5a"><li>一句话判断：与 caramaschiHG 配套使用，覆盖更全面</li></ul><div class="notion-text notion-block-768218436c044e33aa878935d5bc9e16"><b>Repo 5：microsoft/bitnet-b1.58-2B-4T（HuggingFace 模型）</b></div><ul class="notion-list notion-list-disc notion-block-cc0888e078534d3c82f00dc2e8cef503"><li>HuggingFace 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/microsoft/bitnet-b1.58-2B-4T">https://huggingface.co/microsoft/bitnet-b1.58-2B-4T</a></li></ul><ul class="notion-list notion-list-disc notion-block-e00cbac8945443298d018e888cf0965d"><li>方向标签：deployment / edge / model</li></ul><ul class="notion-list notion-list-disc notion-block-06b505c5235c43a988fe91fe19adc6c1"><li>这项目是干什么的：Microsoft 官方 BitNet b1.58 2B 模型权重，可直接用 BitNet.cpp 框架加载运行</li></ul><ul class="notion-list notion-list-disc notion-block-e8177490cd614e158156476b6b2f0056"><li>为什么今天值得关注：有了模型权重才能真正体验 1-bit LLM；2B 参数对个人 laptop 友好</li></ul><ul class="notion-list notion-list-disc notion-block-49022bbdff7e4fa58b4d539b5f50fafa"><li>与我的相关性：手机端或轻量服务器部署的起点实验模型</li></ul><ul class="notion-list notion-list-disc notion-block-dccee64f8b924049be8597add8feeeed"><li>上手成本：低（BitNet.cpp 文档完整）</li></ul><ul class="notion-list notion-list-disc notion-block-3425221858244102ade899a1a377b7e8"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-af56a6e8a18c45e1a968ab1403383811"><li>是否建议我复现：可以，作为 edge 部署入门实验</li></ul><ul class="notion-list notion-list-disc notion-block-12a3c5f89453481f8127d8f7787a6e52"><li>一句话判断：BitNet.cpp 的配套模型，验证 edge inference 可行性的最直接起点</li></ul><div class="notion-text notion-block-f674f4a7fc3e44c78dacc9a985b74b05"><b>Repo 6：codelion/OpenEvolve（AlphaEvolve 开源实现）</b></div><ul class="notion-list notion-list-disc notion-block-fff1ff66336145d2b91096bca62ddadd"><li>HuggingFace 链接：<a target="_blank" rel="noopener noreferrer" 
class="notion-link" href="https://huggingface.co/blog/codelion/openevolve">https://huggingface.co/blog/codelion/openevolve</a></li></ul><ul class="notion-list notion-list-disc notion-block-5455df426ec7404389b65f273ae39f29"><li>方向标签：agent / research / algorithm-discovery</li></ul><ul class="notion-list notion-list-disc notion-block-2ee5b1b427954498ada354211a5ca0d2"><li>这项目是干什么的：AlphaEvolve 的开源复现，LLM + 进化算法做 algorithm discovery 和代码优化</li></ul><ul class="notion-list notion-list-disc notion-block-cb5120005d6b484ca284e21ea3031be1"><li>为什么今天值得关注：Google AlphaEvolve 扩展访问同期，社区出现可复现实现</li></ul><ul class="notion-list notion-list-disc notion-block-7ab83f9aa042436ca49076cf80227ac1"><li>与我的相关性：中——用于自动优化攀岩 pose 检测算法；更偏研究向</li></ul><ul class="notion-list notion-list-disc notion-block-c8c1cc60e68e4359a6b4862208d44fc3"><li>上手成本：高（进化算法背景 + 复杂 agent 设计）</li></ul><ul class="notion-list notion-list-disc notion-block-2e2e9f6250ad4e79b632556221a97743"><li>是否建议我收藏：是（了解原理）</li></ul><ul class="notion-list notion-list-disc notion-block-649ae95dde4745ee830deb389ee9aaef"><li>是否建议我复现：暂不——先了解思路</li></ul><ul class="notion-list notion-list-disc notion-block-29e0f971fd95456c8b13d9b412c44d2e"><li>一句话判断：AlphaEvolve 民主化的开始，值得关注但暂不深入</li></ul><hr class="notion-hr notion-block-848d83654ec444baa502f8b895c49569"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-d22c767fa59a48bb94206ad24fbac245" data-id="d22c767fa59a48bb94206ad24fbac245"><span><div id="d22c767fa59a48bb94206ad24fbac245" class="notion-header-anchor"></div><a class="notion-hash-link" href="#d22c767fa59a48bb94206ad24fbac245" title="四、今日最值得看的 3 个链接"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 
00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">四、今日最值得看的 3 个链接</span></span></h3><div class="notion-text notion-block-5e6a5d288b354c2c807010f6d21d39e7"><b>🥇 第一优先：arXiv:2505.12854——攀岩 Hold 检测数据集</b></div><div class="notion-text notion-block-0a4847a1331848d39dc8c0dd0f3f1e9f"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2505.12854">https://arxiv.org/abs/2505.12854</a></div><div class="notion-text notion-block-3729f3f784c9482ca3a9d74ace447bf6">为什么：攀岩 AI 数据集极度稀缺，这个专门针对 hold 使用检测的数据集直接填补你 app 的核心功能缺口。读完你就知道数据集是否可获取、标注方式是什么、keypoint-overlap 检测方法如何迁移。配合昨日的 ClimbingCap，构成你 app 最重要的两个数据来源。今天就读。</div><div class="notion-text notion-block-1797c7399c4e40b7b13b1ec8c6ecaee4"><b>🥈 第二优先：Windsurf Wave 13 changelog + 实际上手</b></div><div class="notion-text notion-block-290ec021888a43faa929076cbcf4bc4e"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://windsurf.com/changelog">https://windsurf.com/changelog</a></div><div class="notion-text notion-block-65b023c6c907461cb0d50cea9ae7c48a">为什么：SWE-1.5 免费开放至月底，并行 Agent + Git Worktree 是本周编码工具最大升级。这是限时机会。今天就切换到 Wave 13，在攀岩 app 项目中开启 2-3 个并行 agent，感受实际工作流，同时作为项目经历记录下来。</div><div class="notion-text notion-block-d5f9be7d76044f39926c7ac771ed4127"><b>🥉 第三优先：Belay AI 竞品体验（</b><b><a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://belay.ai">belay.ai</a></b><b>）</b></div><div class="notion-text notion-block-d90de801a82442578a0152505c400c6f"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://belay.ai">https://belay.ai</a></div><div class="notion-text notion-block-8f01784adfaa4239ac014f22c0824dc1">为什么：这是最接近你 app 方向的商业竞品。30 分钟的竞品体验价值远超读一篇论文——直接告诉你差异化机会在哪。注册账号，上传一段攀岩视频，记录功能和体验短板，写进项目的 motivation 部分。</div><hr class="notion-hr notion-block-ec4449a3fb364117bc8641bde93a64de"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-9c4c5d7f29404cf89f4b7e33e3c988b0" 
data-id="9c4c5d7f29404cf89f4b7e33e3c988b0"><span><div id="9c4c5d7f29404cf89f4b7e33e3c988b0" class="notion-header-anchor"></div><a class="notion-hash-link" href="#9c4c5d7f29404cf89f4b7e33e3c988b0" title="五、今日行动清单"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">五、今日行动清单</span></span></h3><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-76baa31e7561402daa4be6ce00f41b19" data-id="76baa31e7561402daa4be6ce00f41b19"><span><div id="76baa31e7561402daa4be6ce00f41b19" class="notion-header-anchor"></div><a class="notion-hash-link" href="#76baa31e7561402daa4be6ce00f41b19" title="1. 今天值得收藏但不必立刻看的"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">1. 
今天值得收藏但不必立刻看的</span></span></h4><ul class="notion-list notion-list-disc notion-block-8379948df0ba4db8a7c7564ce4b797a6"><li>MotionLLM 论文 — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2405.20340">https://arxiv.org/abs/2405.20340</a></li></ul><ul class="notion-list notion-list-disc notion-block-7ee784a1a82f4b4db65197febb020ca5"><li>caramaschiHG/awesome-ai-agents-2026 — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/caramaschiHG/awesome-ai-agents-2026">https://github.com/caramaschiHG/awesome-ai-agents-2026</a></li></ul><ul class="notion-list notion-list-disc notion-block-089f55d943704dbcbe7627cb3f632434"><li>codelion/OpenEvolve — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://huggingface.co/blog/codelion/openevolve">https://huggingface.co/blog/codelion/openevolve</a></li></ul><ul class="notion-list notion-list-disc notion-block-483397e57c734d3299645e9f8aafab6f"><li>AI Dev Tool 2026 Power Rankings — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://blog.logrocket.com/ai-dev-tool-power-rankings/">https://blog.logrocket.com/ai-dev-tool-power-rankings/</a></li></ul><ul class="notion-list notion-list-disc notion-block-f5e272c2ebf54a69bbf501a6ce612c20"><li>Climbing Technique Evaluation via Skeleton Video — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://www.mdpi.com/1424-8220/23/19/8216">https://www.mdpi.com/1424-8220/23/19/8216</a></li></ul><ul class="notion-list notion-list-disc notion-block-7ec59c7076354eb697e3730276192c19"><li>TechCrunch 2026 AI 实用主义转型 — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://techcrunch.com/2026/01/02/in-2026-ai-will-move-from-hype-to-pragmatism/">https://techcrunch.com/2026/01/02/in-2026-ai-will-move-from-hype-to-pragmatism/</a></li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-5f5e09dbc37445e882ad3dd65a3c0be6" 
data-id="5f5e09dbc37445e882ad3dd65a3c0be6"><span><div id="5f5e09dbc37445e882ad3dd65a3c0be6" class="notion-header-anchor"></div><a class="notion-hash-link" href="#5f5e09dbc37445e882ad3dd65a3c0be6" title="2. 今天值得精读的"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">2. 今天值得精读的</span></span></h4><ul class="notion-list notion-list-disc notion-block-163e1eacb59645cc9739b92944965159"><li><b>「The Way Up」arXiv:2505.12854</b>——重点看数据集规模、标注方式、keypoint-overlap 检测方法</li></ul><ul class="notion-list notion-list-disc notion-block-e3ce560a040442c887d923a9ac55d9b4"><li><b>Addy Osmani LLM Coding Workflow</b> — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://addyosmani.com/blog/ai-coding-workflow/">https://addyosmani.com/blog/ai-coding-workflow/</a></li></ul><ul class="notion-list notion-list-disc notion-block-c4b1449a2f90449e9bc2b86da6d1f00c"><li><b>Windsurf Wave 13 changelog</b>——重点看并行 agent 和 Git Worktree 的配置方式</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-0e62d7999e524b4295fe65528dc76f4b" data-id="0e62d7999e524b4295fe65528dc76f4b"><span><div id="0e62d7999e524b4295fe65528dc76f4b" class="notion-header-anchor"></div><a class="notion-hash-link" href="#0e62d7999e524b4295fe65528dc76f4b" title="3. 
今天值得复现 / 试用的"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">3. 今天值得复现 / 试用的</span></span></h4><ul class="notion-list notion-list-disc notion-block-5df430466c0040ae932e26df068c1e5c"><li><b>Windsurf Wave 13 上手</b>：切换到最新版，在攀岩 app 项目中开启 2-3 个并行 agent，记录 workflow</li></ul><ul class="notion-list notion-list-disc notion-block-db4acee95dc74bd7b5264b6ee34b1c13"><li><b>Belay AI 竞品体验</b>：注册账号，上传攀岩视频，记录功能和体验短板（30 分钟）</li></ul><ul class="notion-list notion-list-disc notion-block-b96260b9c98f49418b3584f47f1678bb"><li><b>BitNet.cpp demo</b>（可选）：下载 bitnet-b1.58-2B-4T，本地跑推理验证速度</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-5a04e9cd85c24b559e48b64a89404fab" data-id="5a04e9cd85c24b559e48b64a89404fab"><span><div id="5a04e9cd85c24b559e48b64a89404fab" class="notion-header-anchor"></div><a class="notion-hash-link" href="#5a04e9cd85c24b559e48b64a89404fab" title="4. 今天值得记到项目 Roadmap 的"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">4. 
今天值得记到项目 Roadmap 的</span></span></h4><ul class="notion-list notion-list-disc notion-block-3a5ff132d6ec4f6c867f8257ab87f4da"><li><b>Hold 检测模块</b>：基于「The Way Up」数据集，加入 hold-level 分析作为 app 的差异化功能</li></ul><ul class="notion-list notion-list-disc notion-block-c9c7a995b7b04f06929469c038a69281"><li><b>竞品调研</b>：Belay AI 功能缺口 → 你的 app 差异化机会点（写进 README 的 motivation 部分）</li></ul><ul class="notion-list notion-list-disc notion-block-2d448a82e3f0432ba8f3e5684e54ca13"><li><b>并行开发 workflow</b>：用 Windsurf Wave 13 并行 agent 推进攀岩 app 多模块，记录 workflow 作为项目经历</li></ul><ul class="notion-list notion-list-disc notion-block-99d4752c0f784ee3810f499403747d60"><li><b>LLM 选型 v2</b>：Gemini 3.1 Flash-Lite（$0.25/M tokens）纳入视频帧批量分析成本模型</li></ul><ul class="notion-list notion-list-disc notion-block-53544d29eeaf4d409db761a8f4c86f70"><li><b>Edge 部署备选</b>：BitNet.cpp + bitnet-b1.58-2B-4T 列为手机端长期部署技术路径</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-5c6e42d9a22d421f824552927ebbe0d4" data-id="5c6e42d9a22d421f824552927ebbe0d4"><span><div id="5c6e42d9a22d421f824552927ebbe0d4" class="notion-header-anchor"></div><a class="notion-hash-link" href="#5c6e42d9a22d421f824552927ebbe0d4" title="5. 今天面试里可以拿来讲的 1-2 个点"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">5. 
今天面试里可以拿来讲的 1-2 个点</span></span></h4><div class="notion-text notion-block-8a20447e8d994a1a80ec118531e0d2f0"><b>点 1（产品 + 技术双视角）：</b>「我构建攀岩动作分析 app 时，不只是在写代码——我做了系统性的竞品调研（Belay AI）和学术调研（The Way Up 数据集、ClimbingCap CVPR 2025）。竞品调研发现商业产品在 hold-level feedback 颗粒度上有明显不足，学术调研找到了可填补这个缺口的数据集。这让我的项目有了清晰的差异化方向：从数据驱动的 hold 使用分析切入。」——展示：主动 market research + domain-specific research 能力</div><div class="notion-text notion-block-31dc3f66d26d4f1dbd01dd8b6893b48f"><b>点 2（AI 工程实践）：</b>「我在项目开发中使用了 Windsurf Wave 13 的并行 agent 功能，同时运行多个 coding agent 分别开发 pose estimation 模块、feedback 生成模块和前端 UI，通过 Git Worktree 隔离避免分支冲突。这让我对 multi-agent coordination 的实际挑战有了第一手理解——不只是理论上知道 agent，而是真正设计了 multi-agent workflow 并解决了实际问题。」——展示：agentic 工程实践经验</div></main></div>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[AI 日报 | 2026-03-24]]></title>
            <link>https://dundun0504.com/article/32d670e5-5499-8177-b5e5-c8ec56805534</link>
            <guid>https://dundun0504.com/article/32d670e5-5499-8177-b5e5-c8ec56805534</guid>
            <pubDate>Tue, 24 Mar 2026 00:00:00 GMT</pubDate>
            <description><![CDATA[2026-03-24 AI 技术日报：攀岩反馈 AI 论文双发直接命中 app 方向、BitNet.cpp CPU 边缘推理 HN 370 Points、Helium agentic serving 优化论文、OpenHands coding agent 成熟可用、三巨头模型格局固化。]]></description>
            <content:encoded><![CDATA[<div id="notion-article" class="mx-auto overflow-hidden "><main class="notion light-mode notion-page notion-block-32d670e554998177b5e5c8ec56805534"><div class="notion-viewport"></div><div class="notion-collection-page-properties"></div><blockquote class="notion-quote notion-block-9762e3cce0574d549b30d980132eda5a"><div>📋 <b>今日亮点</b>：攀岩 AI 论文双发（直接命中你的 app）；BitNet.cpp CPU 边缘推理持续热议；OpenHands 是目前最成熟开源 coding agent；Helium 提供 agentic serving 新思路。优先看第一、三、五节。</div></blockquote><hr class="notion-hr notion-block-8e15c172ccee490985ae660a1102a29f"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-a3f2e58ffcdd4174aea2ccbf5c34479f" data-id="a3f2e58ffcdd4174aea2ccbf5c34479f"><span><div id="a3f2e58ffcdd4174aea2ccbf5c34479f" class="notion-header-anchor"></div><a class="notion-hash-link" href="#a3f2e58ffcdd4174aea2ccbf5c34479f" title="一、今日最重要的 5 条"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">一、今日最重要的 5 条</span></span></h3><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-381d31cb94a0477d87c360c75bb2fc86" data-id="381d31cb94a0477d87c360c75bb2fc86"><span><div id="381d31cb94a0477d87c360c75bb2fc86" class="notion-header-anchor"></div><a class="notion-hash-link" href="#381d31cb94a0477d87c360c75bb2fc86" title="1. 
🔥 攀岩反馈生成论文 + ClimbingCap 双发——你的 app 有直接学术背书"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">1. 🔥 攀岩反馈生成论文 + ClimbingCap 双发——你的 app 有直接学术背书</span></span></h4><div class="notion-text notion-block-ad12bed8a92845a59cd0607c15d6a0d3"><b>发生了什么：</b> 两篇直接针对攀岩 AI 的论文同期出现：</div><div class="notion-text notion-block-79d42567ac234976825c32d519877d2e">① <b>arXiv:2602.08996</b>「Generalizing Sports Feedback Generation by Watching Competitions and Reading Books: A Rock Climbing Case Study」(2026-02-09)：研究如何用 Video-LLM + 竞赛视频 + 教练手册生成攀岩动作反馈。提出用免费网络资源 + 跨域迁移解决标注数据稀缺问题；指出 BLEU/ROUGE 不适合运动反馈评估，需设计专用指标。</div><div class="notion-text notion-block-0d9f2d73e9df4e2c8710ae54250d6f85">② <b>ClimbingCap (arXiv:2503.21268, CVPR 2025)</b>：AscendMotion 数据集，412K 帧 RGB+LiDAR+IMU，22 名攀岩教练，12 堵岩壁；提出 world coordinate 下的 3D 攀岩动作重建方法。</div><div class="notion-text notion-block-2c07f19302b64dfb83f1275c2b4cd62f"><b>为什么重要：</b> 完整覆盖「上传视频 → 识别动作 → 提供改进建议」pipeline，是你的 app 最重要的学术参考。</div><div class="notion-text notion-block-188929ce127d4366b7ac2a2fcac53950">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2602.08996">https://arxiv.org/abs/2602.08996</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2503.21268">https://arxiv.org/abs/2503.21268</a></div><hr class="notion-hr notion-block-a7b9b286f23c4a5d9a1a6f722b3a7e0f"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-3e396eae60a546fa939f9e9ff14eeda7" data-id="3e396eae60a546fa939f9e9ff14eeda7"><span><div 
id="3e396eae60a546fa939f9e9ff14eeda7" class="notion-header-anchor"></div><a class="notion-hash-link" href="#3e396eae60a546fa939f9e9ff14eeda7" title="2. 🔥 Microsoft BitNet.cpp：HN 370 Points，100B 模型跑在单 CPU 上"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">2. 🔥 Microsoft BitNet.cpp：HN 370 Points，100B 模型跑在单 CPU 上</span></span></h4><div class="notion-text notion-block-3d415dbea0414a4ca1ff9f4a26ad5460"><b>发生了什么：</b> BitNet.cpp 本周 HN 370 points、169 条评论，3 月持续 GitHub trending。在单 CPU 运行 100B 参数模型，速度 5-7 tokens/sec（接近阅读速度），ARM CPU 加速 1.37x-5.07x，能耗降低 55-82%。核心社区争论：「1-bit 模型在哪些任务已经够用了？」</div><div class="notion-text notion-block-cb672dff35364152bdce425db16dff6f"><b>为什么重要：</b> Edge/mobile 部署门槛大幅降低，GPU-free AI 开始从理论走向实际。</div><div class="notion-text notion-block-ca4933e19f954a9eaf7a082811eb5858"><b>与你的相关性：</b> 攀岩 app 的 mobile 部署路径有了具体技术选型参考。</div><div class="notion-text notion-block-89c4c05256404caba01c724a322d91ee">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/microsoft/BitNet">https://github.com/microsoft/BitNet</a></div><hr class="notion-hr notion-block-d8494612b2c6416ab74daf06012ff565"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-916a62e6a4974588b78f2c1e6509d0a0" data-id="916a62e6a4974588b78f2c1e6509d0a0"><span><div id="916a62e6a4974588b78f2c1e6509d0a0" class="notion-header-anchor"></div><a class="notion-hash-link" href="#916a62e6a4974588b78f2c1e6509d0a0" title="3. 
🔥 Helium (arXiv:2603.16104)：Agent Workflow 的 LLM Serving 新思路"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">3. 🔥 Helium (arXiv:2603.16104)：Agent Workflow 的 LLM Serving 新思路</span></span></h4><div class="notion-text notion-block-b0351419e34a4b018c453facce43560c"><b>发生了什么：</b> 2026-03-17 发表。把 multi-step agentic workflow 建模为「查询计划」，LLM 调用为「算子」，通过 proactive KV caching + cache-aware scheduling，比 vLLM 最高实现 1.56x 加速。</div><div class="notion-text notion-block-31825fff527a41ba928a2f0192b2ee35"><b>为什么重要：</b> 首批从 workflow 视角做 LLM serving 优化的系统论文；现有 serving 系统（vLLM）只优化单次 call，无法利用 multi-step 调用间的结构性依赖。</div><div class="notion-text notion-block-6ef7ade1cfa8488fa1723e324f5867a4"><b>面试价值：</b> 可以讲「为什么 vLLM 对 agentic 场景效率不足，以及 data systems 视角如何解决」。</div><div class="notion-text notion-block-c67eda48e07144158f395416e7e03cdc">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2603.16104">https://arxiv.org/abs/2603.16104</a></div><hr class="notion-hr notion-block-9cc03ccc873b46ffb02c3d2bcb5b4ea4"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-01412d4e357040c3a40e3032d76e1092" data-id="01412d4e357040c3a40e3032d76e1092"><span><div id="01412d4e357040c3a40e3032d76e1092" class="notion-header-anchor"></div><a class="notion-hash-link" href="#01412d4e357040c3a40e3032d76e1092" title="4. 
OpenHands 72% SWE-Bench Verified——开源 Coding Agent 成熟临界点"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">4. OpenHands 72% SWE-Bench Verified——开源 Coding Agent 成熟临界点</span></span></h4><div class="notion-text notion-block-06dcfd2418e24a12874b7b46ee4f3565"><b>发生了什么：</b> OpenHands（原 OpenDevin）用 Claude Sonnet 4.5 + extended thinking 达到 72% SWE-Bench Verified，69K stars，推出 OpenHands Index 多维度评估体系（issue resolution、greenfield development、frontend 等）。</div><div class="notion-text notion-block-f499569245314000b14757103e4e87ce"><b>为什么重要：</b> 目前最成熟的开源 coding agent 平台，Docker 本地部署，可直接用，显著提升开发效率。</div><div class="notion-text notion-block-241ef481101049c3a56483f5eae6de7e">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/OpenHands/OpenHands">https://github.com/OpenHands/OpenHands</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://openhands.dev/blog/openhands-index">https://openhands.dev/blog/openhands-index</a></div><hr class="notion-hr notion-block-e46d5e6aa5874988a0e75b66c23909b3"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-171819ce451741c193b638d15c6bad5c" data-id="171819ce451741c193b638d15c6bad5c"><span><div id="171819ce451741c193b638d15c6bad5c" class="notion-header-anchor"></div><a class="notion-hash-link" href="#171819ce451741c193b638d15c6bad5c" title="5. 
前沿三巨头 GPT-5.4 / Claude Sonnet 4.6 / Gemini 3.1 Pro 格局固化"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">5. 前沿三巨头 GPT-5.4 / Claude Sonnet 4.6 / Gemini 3.1 Pro 格局固化</span></span></h4><div class="notion-text notion-block-c56751d0d6e540889c17be0c5c6b1e05"><b>发生了什么：</b> 三者 Artificial Analysis Intelligence Index 并列 57 分，差距极小。关键更新：GPT-5.4 原生 computer use（OSWorld-V 75%，人类基线 72.4%）；Claude Sonnet 4.6 1M context GA + memory 全量；Gemini 3.1 Flash-Lite 仅 $0.25/M tokens，速度快 2.5x。</div><div class="notion-text notion-block-d0ab3b73123d4cd9a3b7bea04a81badb"><b>结论：</b> 模型选择不再是关键差异化因素；Flash-Lite 极低价格对高频视频帧分析场景很有吸引力。</div><div class="notion-text notion-block-4eb1de4b82f4450aa239652a3f5fc24d">🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://llm-stats.com/llm-updates">https://llm-stats.com/llm-updates</a></div><hr class="notion-hr notion-block-6ed1f6dd5cdf4b64aa84afbee15e15b0"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-49a85910022c47f5b1540d9d7d84720a" data-id="49a85910022c47f5b1540d9d7d84720a"><span><div id="49a85910022c47f5b1540d9d7d84720a" class="notion-header-anchor"></div><a class="notion-hash-link" href="#49a85910022c47f5b1540d9d7d84720a" title="二、按目标分类"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 
4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">二、按目标分类</span></span></h3><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-eae429d3b0d945e4acc2a48e218fcc95" data-id="eae429d3b0d945e4acc2a48e218fcc95"><span><div id="eae429d3b0d945e4acc2a48e218fcc95" class="notion-header-anchor"></div><a class="notion-hash-link" href="#eae429d3b0d945e4acc2a48e218fcc95" title="A. 前沿模型 / 一手发布"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">A. 前沿模型 / 一手发布</span></span></h4><div class="notion-text notion-block-379362d59e294edb9151e1a5d0d362c1"><b>【GPT-5.4 原生 Computer Use + OSWorld-V 75%】</b></div><ul class="notion-list notion-list-disc notion-block-a4cf50c19ef44fb9b8363f457d5b7f7e"><li>事件：OpenAI GPT-5.4 发布，具备原生 computer use，可自主控制桌面完成多步工作流</li></ul><ul class="notion-list notion-list-disc notion-block-37b2a54852f3455c93b78ee7d6f3c53b"><li>核心内容：1M token 上下文；OSWorld-V 75%（人类基线 72.4%）；desktop productivity 任务超越人类</li></ul><ul class="notion-list notion-list-disc notion-block-9b5548ed336845b5a2f5b27c48eef1b0"><li>为什么重要：从「描述操作」到「真正执行操作」是质变；agentic 工程设计范式改变</li></ul><ul class="notion-list notion-list-disc notion-block-4a93c19d62214013bfaf5cd0c5a568bf"><li>我需不需要点开：需要——了解 computer use API，对 agent 工程有直接参考</li></ul><ul class="notion-list notion-list-disc notion-block-0cd10808c5ef4d9ca7df8fa05e106501"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://renovateqr.com/blog/ai-model-releases-2026">https://renovateqr.com/blog/ai-model-releases-2026</a></li></ul><div 
class="notion-text notion-block-188a1aced3c24dc09352b44081d507f6"><b>【Claude Sonnet 4.6：1M Context GA + Memory 全量】</b></div><ul class="notion-list notion-list-disc notion-block-4206de46c7b54a89972843141113c508"><li>事件：Anthropic 2026-02-17 发布；3 月全量推出跨对话 memory 功能</li></ul><ul class="notion-list notion-list-disc notion-block-a00d297c06534d5f97749bf15ec5a080"><li>核心内容：1M token 上下文正式 GA（不再 beta）；跨对话记忆持久化；coding、agent planning 能力提升</li></ul><ul class="notion-list notion-list-disc notion-block-3a0d78fbe6b747ef8cfc8eba47c35e56"><li>为什么重要：1M context 实用化，长视频 transcript 可 end-to-end 给 LLM，不再需要 chunking pipeline</li></ul><ul class="notion-list notion-list-disc notion-block-8a925d3035714c9d89899d15e45e10f8"><li>我需不需要点开：需要——直接影响你的视频分析 pipeline 架构选择</li></ul><ul class="notion-list notion-list-disc notion-block-b1f998f783bb455799d14112eb03d64f"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://renovateqr.com/blog/ai-model-releases-2026">https://renovateqr.com/blog/ai-model-releases-2026</a></li></ul><div class="notion-text notion-block-eebcd0d5a45f417689030cb1483f3a3b"><b>【Google AlphaEvolve：LLM + 进化算法，静默运行 Google 基础设施 1 年】</b></div><ul class="notion-list notion-list-disc notion-block-f7d3fdb032b64868aa27e2f8192a3f01"><li>事件：Google DeepMind 公开 AlphaEvolve，Gemini 驱动的 coding agent，内嵌进化算法</li></ul><ul class="notion-list notion-list-disc notion-block-361fdc9fbd2f4b818065ba2e1dcb44f7"><li>核心内容：已在 Google 内部运行 &gt;1 年；节省 0.7% 全球算力；Gemini kernel 加速 23%；数学上发现新结构</li></ul><ul class="notion-list notion-list-disc notion-block-83a5ac2ffb0f4fe18e99ee7aac9fd0d5"><li>为什么重要：「AI 优化 AI 自身基础设施」首次大规模验证，代表 AI 工程未来形态</li></ul><ul class="notion-list notion-list-disc notion-block-f416b142375545afb62455659fd71426"><li>我需不需要点开：了解即可，暂无公开可复现实现</li></ul><ul class="notion-list notion-list-disc notion-block-cf56003917324267a97e2be2fe197511"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" 
href="https://crescendo.ai/news/latest-ai-news-and-updates">https://crescendo.ai/news/latest-ai-news-and-updates</a></li></ul><div class="notion-text notion-block-a19e05a0ae294856994cf5c0c5d3bb73"><b>【Gemini 3.1 Flash-Lite：$0.25/M tokens，速度快 2.5x】</b></div><ul class="notion-list notion-list-disc notion-block-cbef4f8acb734b7e8aa3239b26e631dd"><li>事件：Google 发布效率导向新品，面向高频调用场景</li></ul><ul class="notion-list notion-list-disc notion-block-9ad9d10f0bc04bcf81a221a6db8c8c17"><li>核心内容：比前代快 2.5x，输出速度快 45%，价格仅 $0.25/M input tokens</li></ul><ul class="notion-list notion-list-disc notion-block-33878dc3417e4b9498db5b2e85c67277"><li>为什么重要：对高频调用 app（视频帧批量分析）极具性价比</li></ul><ul class="notion-list notion-list-disc notion-block-3554a3e040fd457c82cd49373bf5dd01"><li>我需不需要点开：值得关注定价；攀岩 app 视频帧批量分析的直接成本优化选项</li></ul><ul class="notion-list notion-list-disc notion-block-c502e35c212c44a3b1ecd20ae9ece36c"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://llm-stats.com/llm-updates">https://llm-stats.com/llm-updates</a></li></ul><hr class="notion-hr notion-block-673fc4839bbf47eca5520bdf75467225"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-13d774b5eef14dfe8ba74ea303600868" data-id="13d774b5eef14dfe8ba74ea303600868"><span><div id="13d774b5eef14dfe8ba74ea303600868" class="notion-header-anchor"></div><a class="notion-hash-link" href="#13d774b5eef14dfe8ba74ea303600868" title="B. AI 工程 / Agent / Coding Workflow"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">B. 
AI 工程 / Agent / Coding Workflow</span></span></h4><div class="notion-text notion-block-2ca010443bba481584d9ac9dcd81d165"><b>【Helium: Efficient LLM Serving for Agentic Workflows (arXiv:2603.16104)】</b></div><ul class="notion-list notion-list-disc notion-block-6c53a8a4396b4f7cbd7f505cc520841b"><li>内容：把 agentic workflow 建模为查询计划，LLM 调用为算子，proactive KV caching + cache-aware scheduling，比 vLLM 最高快 1.56x</li></ul><ul class="notion-list notion-list-disc notion-block-885f0bee170e410dabc944dc841f3822"><li>可落地价值：减少 multi-step agent pipeline 的 latency/cost；适合「视频上传→转录→多步分析」的攀岩 app workflow</li></ul><ul class="notion-list notion-list-disc notion-block-786f3f7966a949aeb6d8ee372f046989"><li>对我当前开发/学习的意义：理解 agent serving 系统设计；面试可讲「vLLM 对 agentic 场景不够优化的原因」</li></ul><ul class="notion-list notion-list-disc notion-block-6e99b08073d541f7b13cbaab6b5fa7af"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2603.16104">https://arxiv.org/abs/2603.16104</a></li></ul><div class="notion-text notion-block-07b05355bf384519abce3b972a544c10"><b>【OpenHands 开源 Coding Agent（69K ⭐，72% SWE-Bench）】</b></div><ul class="notion-list notion-list-disc notion-block-51683b4661444fff9e7aa6c5a1912112"><li>内容：目前最成熟的开源 coding agent 平台，Docker 本地部署，支持多种 LLM 后端，有完整 eval 体系</li></ul><ul class="notion-list notion-list-disc notion-block-d156724104d140d48a0ec551fb62d6af"><li>可落地价值：直接提升开发效率；让 agent 替你写代码、修 bug、跑测试</li></ul><ul class="notion-list notion-list-disc notion-block-b074bffd3c0b4504a90f0ed1025bede5"><li>对我当前开发/学习的意义：本周就跑起来；同时学习其 agent workflow 架构设计作为面试素材</li></ul><ul class="notion-list notion-list-disc notion-block-8542581cd3994d22950ff8a8b5c1fec4"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/OpenHands/OpenHands">https://github.com/OpenHands/OpenHands</a></li></ul><div class="notion-text notion-block-278c4926c3df48ea97fd8c5f49f0f0fa"><b>【Coding Agents 在 GitHub 渗透率已达 15-22%（arXiv:2601.18341）】</b></div><ul 
class="notion-list notion-list-disc notion-block-657c7f85163248c2ae990b143c80751d"><li>内容：大规模研究 129,134 个项目，coding agent 使用率 15.85–22.60%，且仍在增长</li></ul><ul class="notion-list notion-list-disc notion-block-3e45e5aa04f4482bb5bf27c52b10be96"><li>可落地价值：确认「日常开发使用 coding agent」已是行业实践，非前沿研究</li></ul><ul class="notion-list notion-list-disc notion-block-7cb1cbf7cb344b9a902c9a1312dafa1f"><li>对我当前开发/学习的意义：简历/面试中表达「我使用 coding agent 提升开发效率」是正确的职业定位</li></ul><ul class="notion-list notion-list-disc notion-block-e1a06107d2ce4fd0aaeacc7447d5a0eb"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2601.18341">https://arxiv.org/abs/2601.18341</a></li></ul><div class="notion-text notion-block-b88cc091211a4eb0afdd644667da2d20"><b>【HyEvo: Self-Evolving Hybrid Agentic Workflows (arXiv:2603.19639)】</b></div><ul class="notion-list notion-list-disc notion-block-eb18c55f06794f1ab4c94b13ed59a7c4"><li>内容：LLM agent 在推理时自动演化 workflow 结构（混合 CoT + tool use），减少人工 prompt 设计成本</li></ul><ul class="notion-list notion-list-disc notion-block-4c6dbb9ad12948c0a07a8f2b50489cfd"><li>可落地价值：为 multi-step reasoning agent 提供 self-optimizing 思路</li></ul><ul class="notion-list notion-list-disc notion-block-4221f2653b844813ac6ec04ab399fa78"><li>对我当前开发/学习的意义：设计攀岩分析 agent pipeline 时可参考 self-evolving workflow 的架构思想</li></ul><ul class="notion-list notion-list-disc notion-block-46b4caef199543b0b8f6519f0c75f9d5"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2603.19639">https://arxiv.org/abs/2603.19639</a></li></ul><hr class="notion-hr notion-block-85d32deb4dfb415f8a1fadb55cb6f277"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-a006bc0ffccf4b8eb773fb5b36a71b42" data-id="a006bc0ffccf4b8eb773fb5b36a71b42"><span><div id="a006bc0ffccf4b8eb773fb5b36a71b42" class="notion-header-anchor"></div><a class="notion-hash-link" href="#a006bc0ffccf4b8eb773fb5b36a71b42" title="C. 
视觉 / 视频 / 运动人体分析"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">C. 视觉 / 视频 / 运动人体分析</span></span></h4><div class="notion-text notion-block-3ee0ef6c65114e918474e80931636e32"><b>【⭐ 高优先级】「Generalizing Sports Feedback Generation: A Rock Climbing Case Study」(arXiv:2602.08996)</b></div><ul class="notion-list notion-list-disc notion-block-3db9b544fb6f4ce89d3b13462c3815f4"><li>内容：Video-LLM 在运动反馈生成专项研究，攀岩为 case study。用竞赛视频+教练手册+跨域 feedback 迁移解决标注稀缺；指出 BLEU/ROUGE/BERTScore 均不适合运动反馈评估</li></ul><ul class="notion-list notion-list-disc notion-block-7091e448fce7453fbbc6692e8a56737e"><li>与「攀岩动作分析 app」的相关性：极高。直接研究「Video-LLM 给攀岩视频提供动作改进建议」，和你的 app 核心功能一模一样</li></ul><ul class="notion-list notion-list-disc notion-block-49b015ffab144fa7a67aa01ae3e84ab5"><li>可迁移到项目的点：① 用 YouTube 比赛视频+教练手册作为辅助训练数据（免费可获取）；② 跨域迁移策略（从有更多数据的运动迁移到攀岩）；③ 需要设计专用 evaluation metric</li></ul><ul class="notion-list notion-list-disc notion-block-58cbaf72c0654081ae4f53e602fb26dd"><li>优先级：高——今天就读</li></ul><ul class="notion-list notion-list-disc notion-block-665eadedd8324bd186b5f9b1c720ca80"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2602.08996">https://arxiv.org/abs/2602.08996</a></li></ul><div class="notion-text notion-block-58a64fbcaa16480b82a84e3f41c1f426"><b>【⭐ 高优先级】ClimbingCap (arXiv:2503.21268, CVPR 2025)</b></div><ul class="notion-list notion-list-disc notion-block-a6e938129ee744048f39c974f7fd5b65"><li>内容：AscendMotion 数据集，412K RGB+LiDAR+IMU 帧，22 名攀岩教练，12 堵岩壁；world coordinate 下的 3D 攀岩动作重建；semi-supervised training 策略</li></ul><ul 
class="notion-list notion-list-disc notion-block-c5b4124cc1db48fa8b1c80f132d16f3d"><li>与「攀岩动作分析 app」的相关性：高。目前最完整的攀岩运动捕捉数据集，CVPR 级别学术背书</li></ul><ul class="notion-list notion-list-disc notion-block-d5241de472604cf09d558f23cc1d918b"><li>可迁移到项目的点：① 数据集可能公开（项目主页已上线）；② RGB-only 方案可简化（不依赖 LiDAR）；③ semi-supervised training 对数据少的场景有价值</li></ul><ul class="notion-list notion-list-disc notion-block-e17c12cb75584bc3bc59f7d5872235a1"><li>优先级：高——今天就读，看数据集是否可申请</li></ul><ul class="notion-list notion-list-disc notion-block-a4d2b7f13e184c3eaba4bd4affc76cb6"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2503.21268">https://arxiv.org/abs/2503.21268</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://www.lidarhumanmotion.net/climbingcap/">http://www.lidarhumanmotion.net/climbingcap/</a></li></ul><div class="notion-text notion-block-061cd21857d347829ad106aa29b055b1"><b>【中优先级】Commercial Vision Sensors + AI Pose Estimation for Sports (PMC 2026)</b></div><ul class="notion-list notion-list-disc notion-block-b0ec7baf24bb43b2a639d78caf5a78fd"><li>内容：商业视觉传感器（iPhone 等）+ AI 姿态估计在运动健身场景的 mini review，覆盖 markerless motion analysis</li></ul><ul class="notion-list notion-list-disc notion-block-cd2a4549b9dc4612a28c78bfcddebb9b"><li>与「攀岩动作分析 app」的相关性：中。提供「手机摄像头做 markerless motion analysis」的实用方案综述</li></ul><ul class="notion-list notion-list-disc notion-block-6c2079025ec84bd0a66d23c0b8cd6e39"><li>可迁移到项目的点：了解 MediaPipe、DensePose 在实际运动场景的适用性和精度边界；手机端 pose estimation 的现实限制</li></ul><ul class="notion-list notion-list-disc notion-block-6ea55e09f71a49c18040e7be384a8f09"><li>优先级：中——收藏备查</li></ul><ul class="notion-list notion-list-disc notion-block-e27ccd19eb38456a9f05f35938bf324e"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://pmc.ncbi.nlm.nih.gov/articles/PMC12378739/">https://pmc.ncbi.nlm.nih.gov/articles/PMC12378739/</a></li></ul><div class="notion-text 
notion-block-5e235a3c07f14a09994214ff6ad69eee"><b>【中优先级】ML for Climbing Move Sequence Visualization (arXiv:2503.00458)</b></div><ul class="notion-list notion-list-disc notion-block-ab68c68f4d324f06b427ed9309c70c01"><li>内容：用 ML 对攀岩 boulder problem 移动序列进行可视化和生成（2025-03）</li></ul><ul class="notion-list notion-list-disc notion-block-ab06e70dca6141b3a8192acb8166e61e"><li>与「攀岩动作分析 app」的相关性：中。路线序列可视化可作为 app 的一个功能模块</li></ul><ul class="notion-list notion-list-disc notion-block-279450f5447b444ca7e3730f46d914e5"><li>可迁移到项目的点：攀岩路线 hold 序列的自动分析和生成</li></ul><ul class="notion-list notion-list-disc notion-block-b41aa1c3feb344d2b4fc62c429d9010b"><li>优先级：中</li></ul><ul class="notion-list notion-list-disc notion-block-628ee830c1304c4fa8467069f670a147"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2503.00458">https://arxiv.org/abs/2503.00458</a></li></ul><hr class="notion-hr notion-block-6b756d3b85c5419aa24823a8cf61c914"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-0ec05a79c5b345b39f624cf7b11f27e6" data-id="0ec05a79c5b345b39f624cf7b11f27e6"><span><div id="0ec05a79c5b345b39f624cf7b11f27e6" class="notion-header-anchor"></div><a class="notion-hash-link" href="#0ec05a79c5b345b39f624cf7b11f27e6" title="D. 产品化 / 商业化 / 行业动态"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">D. 
产品化 / 商业化 / 行业动态</span></span></h4><div class="notion-text notion-block-fbc0dfef214043beac5b7b8dea485544"><b>【OpenAI 年化收入超 250 亿美元，启动 IPO 准备】</b></div><ul class="notion-list notion-list-disc notion-block-82a6ec0c8c314b36a07bc77f237eb84d"><li>动态：OpenAI 年化收入超 250 亿美元，最早可能 2026 年底上市</li></ul><ul class="notion-list notion-list-disc notion-block-aca310923c14454ca1e9e9e8d84a16db"><li>背后的趋势判断：AI 基础层商业化已非常成熟；竞争焦点从模型能力转向 ecosystem（distribution、infra、legal positioning）</li></ul><ul class="notion-list notion-list-disc notion-block-7546c029acb84a52be7a3ca1d3e8bc62"><li>对 side project / 求职 / 项目方向的启发：做 AI 应用比做模型更有机会；找「API 能解决但竞争还不激烈」的垂直方向（如攀岩 app）</li></ul><ul class="notion-list notion-list-disc notion-block-ad47e929f8a2467aa225896bb9be4113"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://crescendo.ai/news/latest-ai-news-and-updates">https://crescendo.ai/news/latest-ai-news-and-updates</a></li></ul><div class="notion-text notion-block-e60b68a792974d639e81e3efcc297311"><b>【2026 年 = AI 从炒作到实用的转折年（TechCrunch / MIT Tech Review）】</b></div><ul class="notion-list notion-list-disc notion-block-70e578359afb4694ab40562e21532b04"><li>动态：多个权威媒体预判 2026 AI 转向 pragmatism；重点是 smaller models、physical device embedding、human workflow integration</li></ul><ul class="notion-list notion-list-disc notion-block-406405f268b64b82931206270bb34d53"><li>背后的趋势判断：大模型能力到顶，差异化在应用层；垂直场景 + 实际可用性 &gt; 更大参数</li></ul><ul class="notion-list notion-list-disc notion-block-75584f46122c4304a7b94f45be2b4223"><li>对 side project / 求职 / 项目方向的启发：做垂直场景 AI 应用比通用工具更有差异化；「能落地」比「懂前沿」更受欢迎</li></ul><ul class="notion-list notion-list-disc notion-block-08e2a9502b544e948278d82eb5401c09"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://techcrunch.com/2026/01/02/in-2026-ai-will-move-from-hype-to-pragmatism/">https://techcrunch.com/2026/01/02/in-2026-ai-will-move-from-hype-to-pragmatism/</a></li></ul><div class="notion-text 
notion-block-a7822b0ec3da46d9a777dad88c662cec"><b>【Luma AI Uni-1：图像理解 + 生成统一架构】</b></div><ul class="notion-list notion-list-disc notion-block-7de426e9081646c0947919f4f7bb4ce3"><li>动态：Luma AI 发布 Uni-1，将图像理解和生成整合在单一架构，推理时「边想边生成」（待验证细节）</li></ul><ul class="notion-list notion-list-disc notion-block-4fda30f90b454a90a43a7f6561ec0e79"><li>背后的趋势判断：understand + generate 统一是多模态下一步；Luma 挑战 OpenAI/Google 多模态领地</li></ul><ul class="notion-list notion-list-disc notion-block-bdb9e7439b7e47cabad7520d2b1c08dc"><li>对 side project / 求职 / 项目方向的启发：在「先看视频再提建议」的应用场景中有潜力；可关注 Luma API</li></ul><ul class="notion-list notion-list-disc notion-block-b2b524db42484be7ae493dfee523b079"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://crescendo.ai/news/latest-ai-news-and-updates">https://crescendo.ai/news/latest-ai-news-and-updates</a></li></ul><hr class="notion-hr notion-block-27f7be712fd541b5b644d514b564cff9"/><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-7b5339f74ac34c3294b5885ad0740f50" data-id="7b5339f74ac34c3294b5885ad0740f50"><span><div id="7b5339f74ac34c3294b5885ad0740f50" class="notion-header-anchor"></div><a class="notion-hash-link" href="#7b5339f74ac34c3294b5885ad0740f50" title="E. 学习价值 / 求职价值"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">E. 
学习价值 / 求职价值</span></span></h4><div class="notion-text notion-block-be3678405c5b4dc096d4b75674c70700"><b>【Helium 论文（arXiv:2603.16104）：LLM Infra 面试的优质素材】</b></div><ul class="notion-list notion-list-disc notion-block-c4f40294077d4cd3b8f67b780b6a027f"><li>内容：用 data systems 视角优化 agentic workflow serving；proactive KV caching + cache-aware scheduling，比 vLLM 快 1.56x</li></ul><ul class="notion-list notion-list-disc notion-block-696349acfe6b43d5b4c794790892500f"><li>适合我怎么用：精读 + 面试表达。能讲清楚「为什么 vLLM 对 multi-step agent 不够优化，Helium 如何从 workflow 视角解决」，体现 LLM infra 深度</li></ul><ul class="notion-list notion-list-disc notion-block-483cca1694c642078dac554dcab17869"><li>推荐动作：精读 abstract + intro + design section；准备 2 分钟讲解</li></ul><ul class="notion-list notion-list-disc notion-block-ad74a38fbe4f45ffb7b65f8607ab0967"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2603.16104">https://arxiv.org/abs/2603.16104</a></li></ul><div class="notion-text notion-block-69d92dbc90784663a77aa1fece4f7bff"><b>【ClimbingCap + Sports Feedback 论文：项目背书 + 面试差异化】</b></div><ul class="notion-list notion-list-disc notion-block-487696ba1ac448d0999819e9f227c865"><li>内容：两篇直接针对攀岩 AI 的论文，是你 app 项目的最强学术背书</li></ul><ul class="notion-list notion-list-disc notion-block-2b7e3ae4f4864b47a6b5b3ca2d5d072a"><li>适合我怎么用：精读 + 项目路线图 + 面试表达。简历/portfolio 可写「参考 CVPR 2025 ClimbingCap + arXiv:2602.08996 构建攀岩动作分析 pipeline」</li></ul><ul class="notion-list notion-list-disc notion-block-d090271f83874dec9f1c8eb426c938db"><li>推荐动作：精读两篇；在项目 README 中引用；面试时作为「了解 domain-specific AI research」的证据</li></ul><ul class="notion-list notion-list-disc notion-block-8fdeca3ac4e54bccb920e2b2f01bc19b"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2503.21268">https://arxiv.org/abs/2503.21268</a> | <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2602.08996">https://arxiv.org/abs/2602.08996</a></li></ul><div 
class="notion-text notion-block-9ad80b5eba8d46e8bd803adbb6a331e5"><b>【OpenHands 上手实践：coding agent 面试 + 开发加速双收】</b></div><ul class="notion-list notion-list-disc notion-block-e8743521bbd044b8871919590ad50402"><li>内容：目前最成熟的开源 coding agent，Docker 本地运行，文档完整</li></ul><ul class="notion-list notion-list-disc notion-block-84eb29696eba42bda6bcdd31cc907ada"><li>适合我怎么用：复现 + 面试表达。部署并用它解决攀岩 app 中的真实 coding task，作为「我在日常开发中使用 agent workflow」的具体案例</li></ul><ul class="notion-list notion-list-disc notion-block-cbff170198154c108d412a604b8adfc6"><li>推荐动作：本周内部署，完成一个真实任务，截图记录 workflow，写进项目经历</li></ul><ul class="notion-list notion-list-disc notion-block-f279adb5ee514e528c3aadf4101fce5c"><li>🔗 <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/OpenHands/OpenHands">https://github.com/OpenHands/OpenHands</a></li></ul><hr class="notion-hr notion-block-40e5a188dc2e4ca0a167f1adc72932da"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-d92018a25a0c4867b0e2dd99e2fc716d" data-id="d92018a25a0c4867b0e2dd99e2fc716d"><span><div id="d92018a25a0c4867b0e2dd99e2fc716d" class="notion-header-anchor"></div><a class="notion-hash-link" href="#d92018a25a0c4867b0e2dd99e2fc716d" title="三、今日高分 GitHub Repo"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">三、今日高分 GitHub Repo</span></span></h3><div class="notion-text notion-block-84f83915140542e7ab1f6be5270d1d98"><b>Repo 1：microsoft/BitNet</b></div><ul class="notion-list notion-list-disc notion-block-e07334ee71a84a59bde6c7256518e741"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" 
class="notion-link" href="https://github.com/microsoft/BitNet">https://github.com/microsoft/BitNet</a></li></ul><ul class="notion-list notion-list-disc notion-block-124bffde1e82428d85d83fb01e8b75b8"><li>方向标签：infra / deployment / edge</li></ul><ul class="notion-list notion-list-disc notion-block-0aea4f49b7e84a419d652ef20e75b323"><li>这项目是干什么的：Microsoft 官方 1-bit LLM 推理框架，CPU 上高效运行 1-bit LLMs（BitNet b1.58），无需 GPU</li></ul><ul class="notion-list notion-list-disc notion-block-65f2d5bc56ff49b7bf6da61b792bc729"><li>为什么今天值得关注：本周 HN 370 points、169 条评论；3 月持续 GitHub trending；edge AI 关键基础设施</li></ul><ul class="notion-list notion-list-disc notion-block-6f261429f50f472896b2aab9abe26cec"><li>与我的相关性：攀岩 app mobile 部署路径；手机端无 GPU 推理的核心技术选型</li></ul><ul class="notion-list notion-list-disc notion-block-79265c47fc304f2a8e086d5dbd36e526"><li>上手成本：中（需了解 quantization 基础）</li></ul><ul class="notion-list notion-list-disc notion-block-736a57fe69ad474aa22a581cb8a33dbb"><li>是否建议我收藏：是</li></ul><ul class="notion-list notion-list-disc notion-block-b9b68d1e58e348cf88cac9acdcdd143f"><li>是否建议我复现：可先跑 demo 验证 CPU 速度（低门槛）</li></ul><ul class="notion-list notion-list-disc notion-block-1273d68aec21420e969abdabae74d2f1"><li>一句话判断：edge AI 重要基础设施，今天了解原理，中期作为 mobile 部署备选</li></ul><div class="notion-text notion-block-cad90d26ed0b454a93610981b15524f3"><b>Repo 2：OpenHands/OpenHands</b></div><ul class="notion-list notion-list-disc notion-block-3d1d93be5fc34a3d988d71a0ad9f536b"><li>GitHub 链接：<a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/OpenHands/OpenHands">https://github.com/OpenHands/OpenHands</a></li></ul><ul class="notion-list notion-list-disc notion-block-0d7963eb4acf40ba83ac6e011d2c5e6d"><li>方向标签：agent / coding / dev tools</li></ul><ul class="notion-list notion-list-disc notion-block-f2dbca0624e64a918748ee2818b001d9"><li>这项目是干什么的：开源 AI coding agent 平台，自主写代码/修 bug/跑测试，支持多种 LLM 后端</li></ul><ul class="notion-list notion-list-disc 
notion-block-eba8945baad247e39fa6d3dd33c3b4a6"><li>Why it matters today: 69K stars; 72% on SWE-Bench Verified; currently the most mature open-source coding agent</li></ul><ul class="notion-list notion-list-disc notion-block-20a9bb05da014d5aa717237c04d64773"><li>Relevance to me: directly accelerates climbing-app development; also a reference for agent system architecture</li></ul><ul class="notion-list notion-list-disc notion-block-b596419ce3644006a09037409ae19195"><li>Ramp-up cost: low (one-command Docker deployment)</li></ul><ul class="notion-list notion-list-disc notion-block-458ea8d0a1724291829ed96e06289daf"><li>Should I bookmark it: yes</li></ul><ul class="notion-list notion-list-disc notion-block-1fe26b3f4ee040298371bdf56ea72d07"><li>Should I reproduce it: strongly recommended; get it running this week</li></ul><ul class="notion-list notion-list-disc notion-block-5fa424959b5644f98fcc609f2a2b6644"><li>One-line verdict: the open-source coding agent most worth picking up right now; no need to wait, just use it</li></ul><div class="notion-text notion-block-ada61845031443f48b46fca9b74c0107"><b>Repo 3: VoltAgent/awesome-ai-agent-papers</b></div><ul class="notion-list notion-list-disc notion-block-9ce084f8a3d643b7aad3c3e53261bfe6"><li>GitHub link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/VoltAgent/awesome-ai-agent-papers">https://github.com/VoltAgent/awesome-ai-agent-papers</a></li></ul><ul class="notion-list notion-list-disc notion-block-81fb394601da453d8494648cfd9a66c4"><li>Direction tags: agent / research / curated</li></ul><ul class="notion-list notion-list-disc notion-block-9d7b8cc610594cf59b8fc457c1ffa644"><li>What this project does: a curated list of 2026 AI agent papers covering agent engineering, memory, and evaluation; continuously updated</li></ul><ul class="notion-list notion-list-disc notion-block-7ee82d8cfeb6467ba5033059ac126be0"><li>Why it matters today: well maintained; helps you track frontier agent research without missing important papers</li></ul><ul class="notion-list notion-list-disc notion-block-2aca30faf03d457f8b79e8e0bb043214"><li>Relevance to me: tracking agent-workflow research</li></ul><ul class="notion-list notion-list-disc notion-block-dbfbd92ddb3f49db920cf9ec83f47114"><li>Ramp-up cost: low (just read the README)</li></ul><ul class="notion-list notion-list-disc notion-block-9e572f155b6a4be78f8cedf566867399"><li>Should I bookmark it: yes</li></ul><ul class="notion-list 
notion-list-disc notion-block-2d3498a622954628a7b7f003c7463f5e"><li>Should I reproduce it: no</li></ul><ul class="notion-list notion-list-disc notion-block-67e4db7bdfc14d92b2ebb1cf3fb15bf0"><li>One-line verdict: an RSS substitute for agent papers; bookmarking is enough</li></ul><div class="notion-text notion-block-25dda308633642dd84ad69e0c3d8deec"><b>Repo 4: ClimbingCap project page (CVPR 2025)</b></div><ul class="notion-list notion-list-disc notion-block-6e8ad5553b6e46f395b42520d92af8a0"><li>Project link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://www.lidarhumanmotion.net/climbingcap/">http://www.lidarhumanmotion.net/climbingcap/</a></li></ul><ul class="notion-list notion-list-disc notion-block-aa07340c85a04e2094bd1bd8de3929d5"><li>Direction tags: video / motion / sports / multimodal</li></ul><ul class="notion-list notion-list-disc notion-block-d937a4074b58491faf3d3863c538eb14"><li>What this project does: a CVPR 2025 climbing motion-capture dataset plus method; 3D climbing-motion reconstruction in world coordinates</li></ul><ul class="notion-list notion-list-disc notion-block-4c32a94a7b4e400fb3f2e4fec90ffaa9"><li>Why it matters today: 100% directly relevant to your climbing app; currently the most complete climbing-motion dataset</li></ul><ul class="notion-list notion-list-disc notion-block-311e6499bd134208b7589aa87245469c"><li>Relevance to me: extremely high; the dataset, method design, and semi-supervised training are all directly usable references</li></ul><ul class="notion-list notion-list-disc notion-block-7491be87526c4e908f39dbc37a1cf531"><li>Ramp-up cost: high (requires LiDAR + 3D pose expertise); a simplified RGB-only approach has a medium barrier</li></ul><ul class="notion-list notion-list-disc notion-block-66ea57e8676d41aca56bdf4f3b9bbd78"><li>Should I bookmark it: yes</li></ul><ul class="notion-list notion-list-disc notion-block-249b5557051640279364ba4b87489108"><li>Should I reproduce it: a mid-term goal; read the paper first, understand the dataset, then decide on a reproduction strategy</li></ul><ul class="notion-list notion-list-disc notion-block-a3d589f76e0b445f9a7eff5a3c3d012e"><li>One-line verdict: the dataset behind a must-read paper for the climbing app; bookmark the project page today, then contact the authors to request the data</li></ul><div class="notion-text notion-block-bb0f0b8bbeaf4985bf43835e16e22f17"><b>Repo 5: caramaschiHG/awesome-ai-agents-2026</b></div><ul class="notion-list notion-list-disc notion-block-02f301d27c034f649e41f66fea1bc925"><li>GitHub link: <a target="_blank" rel="noopener 
noreferrer" class="notion-link" href="https://github.com/caramaschiHG/awesome-ai-agents-2026">https://github.com/caramaschiHG/awesome-ai-agents-2026</a></li></ul><ul class="notion-list notion-list-disc notion-block-4926ef17188741b09d109efbc101b3d7"><li>Direction tags: agent / curated / dev tools</li></ul><ul class="notion-list notion-list-disc notion-block-fcf2d018198f42b7aabf7af72bbc6e9d"><li>What this project does: a comprehensive list of 2026 AI agent frameworks and tools; 300+ resources, 20+ categories, updated monthly</li></ul><ul class="notion-list notion-list-disc notion-block-08a73bceab914e178b18772caf0d5c3e"><li>Why it matters today: continuously maintained; covers the latest agent ecosystem</li></ul><ul class="notion-list notion-list-disc notion-block-f1f3444806a74a11843aa2e5b868f149"><li>Relevance to me: helps you quickly find an agent framework suited to the climbing app</li></ul><ul class="notion-list notion-list-disc notion-block-cb4b97cd52b54c31b6fb753340f163e5"><li>Ramp-up cost: low</li></ul><ul class="notion-list notion-list-disc notion-block-b9931661bbd24ccabc97817cce5a6189"><li>Should I bookmark it: yes</li></ul><ul class="notion-list notion-list-disc notion-block-f1c4c750b31f47278a35437fac4ed34e"><li>Should I reproduce it: no</li></ul><ul class="notion-list notion-list-disc notion-block-1f99bcf7207a428d9ee5ebb8411c9ae4"><li>One-line verdict: a map of the agent ecosystem; bookmark for reference</li></ul><div class="notion-text notion-block-2ebe293fad3c4a18ac8f1276dcbe211c"><b>Repo 6: OpenHands Index (multi-dimensional eval system)</b></div><ul class="notion-list notion-list-disc notion-block-e725281af79d4d73aa620ddf460eb146"><li>GitHub link: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/OpenHands/OpenHands">https://github.com/OpenHands/OpenHands</a> (eval harness embedded in the main repo)</li></ul><ul class="notion-list notion-list-disc notion-block-f7796451c292406caaf9482d84c358b0"><li>Direction tags: eval / agent / benchmark</li></ul><ul class="notion-list notion-list-disc notion-block-3017d34e58a64935ad890946d7364b4c"><li>What this project does: OpenHands' multi-dimensional coding-agent evaluation system, covering issue resolution, greenfield development, frontend work, and more</li></ul><ul class="notion-list notion-list-disc 
notion-block-19495219904b49f9ad5a0d97d3138344"><li>Why it matters today: understanding how coding agents are evaluated is an interview plus; it can also inform the design of your climbing-feedback evaluation</li></ul><ul class="notion-list notion-list-disc notion-block-228b233dd88148a0bea33a1f85ac0275"><li>Relevance to me: learning how to design evaluation (dedicated metrics for assessing the quality of climbing-movement feedback)</li></ul><ul class="notion-list notion-list-disc notion-block-d33fce2304334c3b9357e0fc527eea4d"><li>Ramp-up cost: medium</li></ul><ul class="notion-list notion-list-disc notion-block-45f0166bda444fd89636eced41d60ed2"><li>Should I bookmark it: yes</li></ul><ul class="notion-list notion-list-disc notion-block-86f041d09b6a405a8b0f87b578770297"><li>Should I reproduce it: optional</li></ul><ul class="notion-list notion-list-disc notion-block-295bcab5d0434328afc7fce39ee9e376"><li>One-line verdict: good teaching material for learning agent eval design</li></ul><hr class="notion-hr notion-block-b691289c53e34332a74330b44b63bf70"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-d670b13eda524bd6adf6a4fac562b861" data-id="d670b13eda524bd6adf6a4fac562b861"><span><div id="d670b13eda524bd6adf6a4fac562b861" class="notion-header-anchor"></div><a class="notion-hash-link" href="#d670b13eda524bd6adf6a4fac562b861" title="IV. Today's 3 Most Worthwhile Links"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">IV. Today's 3 Most Worthwhile Links</span></span></h3><div class="notion-text notion-block-cb5ba0dda5594b45904098e22494c1e8"><b>🥇 First priority: arXiv:2602.08996, the climbing-feedback-generation paper</b></div><div class="notion-text notion-block-e1f45769c9bc4ee6a318a1bd3a3419e7"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2602.08996">https://arxiv.org/abs/2602.08996</a></div><div class="notion-text 
notion-block-405bcafb60f24843b9f02e35fd846b7c">Why: this is currently the only paper in AI academia that directly studies using a Video-LLM to generate movement feedback for climbing videos, and it overlaps completely with your app's direction. After reading it you will know which problems already have academic solutions, which are still open problems, and how the evaluation metrics should be designed. Read it today.</div><div class="notion-text notion-block-fc2e414c9d27435b88bec813ecb8b53a"><b>🥈 Second priority: arXiv:2503.21268, ClimbingCap (CVPR 2025)</b></div><div class="notion-text notion-block-bf592a4876f942b8ac32b1b4d6358b8a"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2503.21268">https://arxiv.org/abs/2503.21268</a></div><div class="notion-text notion-block-d79606065f324003a9037f82571e860b">Why: a CVPR-level climbing motion-capture dataset and method, and the technical foundation of your project. You need to find out whether the dataset can be requested for use, and whether a simplified RGB-only approach is feasible. After reading, visit the project page for the dataset application: <a target="_blank" rel="noopener noreferrer" class="notion-link" href="http://www.lidarhumanmotion.net/climbingcap/">http://www.lidarhumanmotion.net/climbingcap/</a></div><div class="notion-text notion-block-c689b3c6ec9246d888299e5bf9be68ed"><b>🥉 Third priority: OpenHands Index Blog (2026-01-28)</b></div><div class="notion-text notion-block-5328181004ea40ea8245e6f2b929b329"><a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://openhands.dev/blog/openhands-index">https://openhands.dev/blog/openhands-index</a></div><div class="notion-text notion-block-68c2a39782d2489ca90f59964f425b2a">Why: a direct look at the capability boundaries of the most mature open-source coding agent, helping you judge what an agent can and cannot do for you. After reading, go get OpenHands running in Docker; you can try it today.</div><hr class="notion-hr notion-block-e99f874a840b489582926d3ac30986b0"/><h3 class="notion-h notion-h2 notion-h-indent-0 notion-block-cc895da4c31648d8afb2a219c39d9129" data-id="cc895da4c31648d8afb2a219c39d9129"><span><div id="cc895da4c31648d8afb2a219c39d9129" class="notion-header-anchor"></div><a class="notion-hash-link" href="#cc895da4c31648d8afb2a219c39d9129" title="V. Today's Action List"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">V. Today's Action List</span></span></h3><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-690768b761714d6db615f008ad83f69b" data-id="690768b761714d6db615f008ad83f69b"><span><div id="690768b761714d6db615f008ad83f69b" class="notion-header-anchor"></div><a class="notion-hash-link" href="#690768b761714d6db615f008ad83f69b" title="1. Worth bookmarking today, not urgent to read"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">1. Worth bookmarking today, not urgent to read</span></span></h4><ul class="notion-list notion-list-disc notion-block-961152def8fa4145b1de48cf27606bf4"><li>microsoft/BitNet — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/microsoft/BitNet">https://github.com/microsoft/BitNet</a></li></ul><ul class="notion-list notion-list-disc notion-block-60e523b6403644d19db5352e35664534"><li>VoltAgent/awesome-ai-agent-papers — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://github.com/VoltAgent/awesome-ai-agent-papers">https://github.com/VoltAgent/awesome-ai-agent-papers</a></li></ul><ul class="notion-list notion-list-disc notion-block-015209ee89fc4ccaadff17fc93200a14"><li>Helium serving paper — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2603.16104">https://arxiv.org/abs/2603.16104</a></li></ul><ul class="notion-list notion-list-disc notion-block-c6b24dc02cfa4e48b98d62b027adc4d8"><li>Commercial vision sensors for sports review (PMC) — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://pmc.ncbi.nlm.nih.gov/articles/PMC12378739/">https://pmc.ncbi.nlm.nih.gov/articles/PMC12378739/</a></li></ul><ul class="notion-list notion-list-disc notion-block-d93f724b86544425b88ca1908f640e52"><li>HyEvo self-evolving agentic workflow — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://arxiv.org/abs/2603.19639">https://arxiv.org/abs/2603.19639</a></li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-b2c8b57a412d47769a95d02f925c5112" data-id="b2c8b57a412d47769a95d02f925c5112"><span><div id="b2c8b57a412d47769a95d02f925c5112" class="notion-header-anchor"></div><a class="notion-hash-link" href="#b2c8b57a412d47769a95d02f925c5112" title="2. Worth reading closely today"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">2. Worth reading closely today</span></span></h4><ul class="notion-list notion-list-disc notion-block-061a9b14208b443ab0ed9b3d51e0135a"><li><b>arXiv:2602.08996</b> (climbing feedback generation): focus on the method and the evaluation-metric design</li></ul><ul class="notion-list notion-list-disc notion-block-9a44bee0fb854dbfb61059c207725f21"><li><b>arXiv:2503.21268</b> (ClimbingCap): focus on dataset scale and the feasibility of an RGB-only approach</li></ul><ul class="notion-list notion-list-disc notion-block-03f54e64f8a14a6c9bc3432de5d3ee8f"><li><b>OpenHands Index blog</b> — <a target="_blank" rel="noopener noreferrer" class="notion-link" href="https://openhands.dev/blog/openhands-index">https://openhands.dev/blog/openhands-index</a></li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-810cb8d4df1e4deea49ab31b24f7e8a9" data-id="810cb8d4df1e4deea49ab31b24f7e8a9"><span><div id="810cb8d4df1e4deea49ab31b24f7e8a9" class="notion-header-anchor"></div><a class="notion-hash-link" href="#810cb8d4df1e4deea49ab31b24f7e8a9" title="3. Worth reproducing / trying today"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">3. Worth reproducing / trying today</span></span></h4><ul class="notion-list notion-list-disc notion-block-968140bf0909434badeacfe3277388d0"><li><b>OpenHands local deployment</b>: get it running with one Docker command, then point the agent at the climbing-app codebase to fix a real bug or write a module</li></ul><ul class="notion-list notion-list-disc notion-block-25d47f2572fc458da2861b0634ed41a2"><li><b>BitNet.cpp demo</b>: verify CPU inference speed and get a feel for how a 1-bit LLM actually performs (optional, low barrier)</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-537822f2ba7246de87539dae49f641a9" data-id="537822f2ba7246de87539dae49f641a9"><span><div id="537822f2ba7246de87539dae49f641a9" class="notion-header-anchor"></div><a class="notion-hash-link" href="#537822f2ba7246de87539dae49f641a9" title="4. Worth adding to the project roadmap today"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">4. Worth adding to the project roadmap today</span></span></h4><ul class="notion-list notion-list-disc notion-block-62cff097cee84a58b352db8828b1539c"><li><b>Climbing-app data strategy</b>: following arXiv:2602.08996, use YouTube climbing-competition videos plus coaching manuals as auxiliary data (freely obtainable)</li></ul><ul class="notion-list notion-list-disc notion-block-50e10241967b46ebb2909883f0378e98"><li><b>Feedback evaluation metrics</b>: skip BLEU/ROUGE; design dedicated metrics for movement feedback (discussed in the paper)</li></ul><ul class="notion-list notion-list-disc notion-block-99e80c71efbb47c5b20e6dccca6fd863"><li><b>ClimbingCap dataset</b>: contact the authors to request AscendMotion, or plan a simplified RGB-only approach</li></ul><ul class="notion-list notion-list-disc notion-block-6e6a30ca209c4b72ade0d1b358430280"><li><b>Edge deployment path</b>: keep BitNet as a long-term candidate technology for mobile deployment</li></ul><ul class="notion-list notion-list-disc notion-block-ae4f02e5cdf34d05b967eae2affd467a"><li><b>LLM selection</b>: include Gemini 3.1 Flash-Lite ($0.25/M tokens) in the cost evaluation for batch video-frame analysis</li></ul><h4 class="notion-h notion-h3 notion-h-indent-1 notion-block-952fbf25c9b44468adb3ba3ac43ac736" data-id="952fbf25c9b44468adb3ba3ac43ac736"><span><div id="952fbf25c9b44468adb3ba3ac43ac736" class="notion-header-anchor"></div><a class="notion-hash-link" href="#952fbf25c9b44468adb3ba3ac43ac736" title="5. One or two points to bring up in interviews today"><svg viewBox="0 0 16 16" width="16" height="16"><path fill-rule="evenodd" d="M7.775 3.275a.75.75 0 001.06 1.06l1.25-1.25a2 2 0 112.83 2.83l-2.5 2.5a2 2 0 01-2.83 0 .75.75 0 00-1.06 1.06 3.5 3.5 0 004.95 0l2.5-2.5a3.5 3.5 0 00-4.95-4.95l-1.25 1.25zm-4.69 9.64a2 2 0 010-2.83l2.5-2.5a2 2 0 012.83 0 .75.75 0 001.06-1.06 3.5 3.5 0 00-4.95 0l-2.5 2.5a3.5 3.5 0 004.95 4.95l1.25-1.25a.75.75 0 00-1.06-1.06l-1.25 1.25a2 2 0 01-2.83 0z"></path></svg></a><span class="notion-h-title">5. One or two points to bring up in interviews today</span></span></h4><div class="notion-text notion-block-06c6c10bd52c4257883c7d33f50a656e"><b>Point 1 (project depth):</b> "While building my climbing-movement-analysis app, I researched ClimbingCap from CVPR 2025 and a February 2026 arXiv paper (2602.08996) that specifically studies using Video-LLMs to generate climbing feedback. The paper also points out that traditional NLP metrics such as BLEU/ROUGE are unsuitable for evaluating movement-feedback quality, so I am designing dedicated evaluation metrics." Demonstrates: depth of domain research, plus awareness of eval design</div><div class="notion-text notion-block-386adf5583ae407499bc5b3f6c646b29"><b>Point 2 (LLM infra):</b> "I read a recent paper called Helium (arXiv:2603.16104), which remodels multi-step agentic workflows from a data-systems perspective: it widens the unit of LLM-serving optimization from a single inference call to the query plan of the whole workflow, achieving up to 1.56x speedup through proactive KV caching. That helped me understand why vLLM falls short on efficiency in agent scenarios." Demonstrates: depth of LLM-infra knowledge, plus the ability to read research papers independently</div></main></div>]]></content:encoded>
        </item>
    </channel>
</rss>