{"id":338335,"date":"2025-06-18T04:58:34","date_gmt":"2025-06-18T11:58:34","guid":{"rendered":"https:\/\/cms-articles.softonic.io\/en\/?p=338335"},"modified":"2025-07-01T14:21:58","modified_gmt":"2025-07-01T21:21:58","slug":"apple-intelligence-outperforms-openai-in-this-feature-55-faster-private-and-local","status":"publish","type":"post","link":"https:\/\/cms-articles.softonic.io\/en\/apple-intelligence-outperforms-openai-in-this-feature-55-faster-private-and-local\/","title":{"rendered":"Apple Intelligence outperforms OpenAI in this feature: 55% faster, private, and local"},"content":{"rendered":"\n<p>Apple has quietly taken a major step in speech-to-text technology with the release of new transcription tools in iOS 26 and macOS Tahoe.&nbsp;<strong>In internal tests, Apple\u2019s frameworks have matched the accuracy of OpenAI\u2019s Whisper<\/strong>&nbsp;while offering a dramatic improvement in speed. These tools, now available in developer betas, promise&nbsp;<strong>real-time, private, and fully on-device transcription<\/strong>, marking a significant leap forward for Apple Intelligence.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">A new transcription engine built by Apple<\/h2>\n\n\n\n<p>Developers working with macOS Tahoe have access to two new modules:&nbsp;<strong>SpeechAnalyzer and SpeechTranscriber<\/strong>, both built into Apple\u2019s existing speech recognition framework. These tools are not tied to the keyboard like Dictation, so they can be embedded into any app or utility.<\/p>\n\n\n\n<p>When tested using a 34-minute video, a simple tool called Yap\u2014built in under 10 minutes using Apple\u2019s new APIs\u2014completed the transcription in&nbsp;<strong>just 45 seconds<\/strong>, compared to&nbsp;<strong>1 minute 41 seconds for MacWhisper<\/strong>&nbsp;using OpenAI\u2019s Large V3 Turbo model. 
VidCap, another popular Whisper-based app, took nearly&nbsp;<strong>two full minutes<\/strong>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">The benefits go beyond speed<\/h2>\n\n\n\n<p>Aside from&nbsp;<strong>cutting transcription time by roughly 55%<\/strong>, Apple\u2019s approach brings&nbsp;<strong>strong privacy advantages<\/strong>, as all transcription is done locally. This eliminates the need to send data to external servers, aligning with Apple\u2019s longstanding emphasis on user privacy. For students, journalists, or anyone needing regular transcriptions, this performance gain translates into&nbsp;<strong>significant time savings<\/strong>&nbsp;across multiple files.<\/p>\n\n\n\n<p>These new capabilities position Apple as a serious competitor in the transcription space and show how Apple Intelligence is evolving to challenge existing AI leaders\u2014not with hype, but with&nbsp;<strong>measurable improvements in both speed and privacy<\/strong>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Apple has quietly taken a major step in speech-to-text technology with the release of new transcription tools in iOS 26 and macOS Tahoe.&nbsp;In internal tests, Apple\u2019s frameworks have matched the accuracy of OpenAI\u2019s Whisper&nbsp;while offering a dramatic improvement in speed. 
These tools, now available in developer betas, promise&nbsp;real-time, private, and fully on-device transcription, marking a &hellip; <a href=\"https:\/\/cms-articles.softonic.io\/en\/apple-intelligence-outperforms-openai-in-this-feature-55-faster-private-and-local\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Apple Intelligence outperforms OpenAI in this feature: 55% faster, private, and local&#8221;<\/span><\/a><\/p>\n","protected":false},"author":9317,"featured_media":338336,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":"","wpcf-pageviews":0},"categories":[1015],"tags":[],"usertag":[],"vertical":[],"content-category":[],"class_list":["post-338335","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/posts\/338335","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/users\/9317"}],"replies":[{"embeddable":true,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/comments?post=338335"}],"version-history":[{"count":1,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/posts\/338335\/revisions"}],"predecessor-version":[{"id":338337,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/posts\/338335\/revisions\/338337"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/media\/338336"}],"wp:attachment":[{"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/media?parent=338335"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/categories?post=338335"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/tags?post=338335"},{"taxonomy":"usertag","embeddable":true,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/usertag?post=338335"},{"taxonomy":"vertical","embeddable":true,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/vertical?post=338335"},{"taxonomy":"content-category","embeddable":true,"href":"https:\/\/cms-articles.softonic.io\/en\/wp-json\/wp\/v2\/content-category?post=338335"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}