Apple Acquires Rapid-Fire Camera App Developer SnappyLabs

Apple has acquired the one-man photo technology startup SnappyLabs, maker of SnappyCam, sources tell me. The startup was founded and run solely by John Papandriopoulos, an electrical engineering PhD from the University of Melbourne who invented a way to make the iPhone’s camera take full-resolution photos at 20 to 30 frames per second — significantly faster than Apple’s native iPhone camera.

I first noticed something was up when we got tipped off that SnappyCam had disappeared from the App Store and all of SnappyLabs’ websites went blank. Sources have since affirmed that the company was acquired by Apple, and that there was also acquisition interest “from most of the usual players,” meaning other tech giants. I don’t have details on the terms of the deal, and I’m awaiting a response from Apple, which has not confirmed the acquisition.

But based on Papandriopoulos’ scientific breakthroughs in photography technology, it’s not hard to see why Apple would want to bring him in to help improve its cameras. The strategic acquisition of an extremely lean, hard technology-focused team (of one) fits with Apple’s MO. It typically buys smaller teams to work on specific products rather than buying big staffs and trying to blend them in across the company.

Papandriopoulos built his burst-mode photo technology into SnappyCam, which he sold in the Apple App Store for $1. After I profiled the app in July, Papandriopoulos told me SnappyCam jumped to #1 on the paid app chart in nine countries. Sales of the app let him run SnappyLabs without big funding from venture capital firms.

Back in July, Papandriopoulos told me he had a eureka moment in “discrete cosine transform JPG science” and had essentially reinvented the JPG image format. In a blog post, now taken down, the SnappyLabs founder explained:

“First we studied the fast discrete cosine transform (DCT) algorithms…We then extended some of that research to create a new algorithm that’s a good fit for the ARM NEON SIMD co-processor instruction set architecture. The final implementation comprises nearly 10,000 lines of hand-tuned assembly code, and over 20,000 lines of low-level C code. (In comparison, the SnappyCam app comprises almost 50,000 lines of Objective C code.)

JPEG compression comprises two parts: the DCT (above), and a lossless Huffman compression stage that forms a compact JPEG file. Having developed a blazing fast DCT implementation, Huffman then became a bottleneck. We innovated on that portion with tight hand-tuned assembly code that leverages special features of the ARM processor instruction set to make it as fast as possible.”
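For readers unfamiliar with the pipeline he describes: JPEG splits an image into 8×8 pixel blocks, runs each through the DCT to concentrate the image energy into a few coefficients, then quantizes and Huffman-codes the result. Here's a minimal, illustrative Python sketch of that first stage — a textbook naive DCT-II, nothing like SnappyCam's hand-tuned NEON assembly, just to show what the transform does:

```python
import math

def dct_2d(block):
    """Naive 8x8 DCT-II -- the transform JPEG applies to each pixel
    block before quantization and Huffman coding. Purely illustrative;
    real encoders (and SnappyCam) use fast factored algorithms."""
    n = 8
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            cu = math.sqrt(0.5) if u == 0 else 1.0
            cv = math.sqrt(0.5) if v == 0 else 1.0
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[u][v] = 0.25 * cu * cv * s
    return out

# A flat gray block: all the energy lands in the top-left (DC) coefficient,
# and the other 63 coefficients are ~0 -- which is why DCT output
# compresses so well under Huffman coding.
flat = [[128] * 8 for _ in range(8)]
coeffs = dct_2d(flat)
print(round(coeffs[0][0]))  # 1024
```

The speed problem Papandriopoulos solved is visible here: the naive version costs O(n⁴) per block, which is why production encoders rely on fast factored DCTs and, in his case, SIMD assembly.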

By bringing Papandriopoulos in-house, Apple could build this technology and more into its iPhone, iPad, Mac, and MacBook cameras. Photography is a core use for smartphones, and offering high-resolution, rapid-fire burst mode shooting could become a selling point for iPhones over competing phones.

And in case you were wondering if Papandriopoulos will be a good fit at Apple, he once dressed as an iPhone at a San Francisco parade.

For more on SnappyLabs, read my profile of the startup:

http://techcrunch.com/2014/01/04/snappylabs/