The Realm Team: Today, we’re launching the Realm Mobile Platform, a new offering that integrates our fully open-source (see below!) client-side database for iOS and Android with new server-side technology providing realtime synchronization, conflict resolution, and reactive event handling. The new platform makes it easy for mobile developers to create apps with difficult-to-build features like realtime collaboration, messaging, offline-first experiences, and more. Our Android developer and I run our company’s apps on top of Realm’s database1. This is great news for apps that need cross-platform (or simply cross-device) synchronisation. CloudKit handles storage and sync across multiple iOS devices; from what I’ve understood from Realm’s article, their service only manages the syncing—you give up the storage in going cross-platform. And, as I mentioned in the footnote, Realm is far better at being a simple database than Core Data—at least for a beginner like me. Look, I’m sure Core Data has tremendous value once you start picking the onion’s layers apart in need of deeper, more precise functionality, but Realm is far more manageable for a basic database. ↩︎
Probably the first Apple Music ad on Apple’s YouTube channel since the service’s introduction—the others were on Beats’ YouTube channel. Subpar ads shouldn’t get away with such long (by ad standards) durations without catching some flak. If this were playing on TV, I’d change the channel.
Joe Rossignol, MacRumors: DisplayMate Technologies has declared iPhone 7 has the "best LCD display" it has ever tested, calling it "truly impressive" and a "major upgrade" over the iPhone 6 based on a series of advanced viewing tests and measurements. Remember the rumours about Apple shifting towards an OLED screen with the iPhone 7? I don’t doubt the idea of Apple choosing an OLED screen in future iPhones (yet another rumour points to Samsung making an OLED display for the 2017 iPhone), because OLED lets the phone’s display blend in and become ‘one’ with the rest of the phone, as it does on the Apple Watch — ‘…you can’t determine a boundary between the physical object and the software’. (Probably no more of those ugly black lines around the iPhone’s display either.) It’s fascinating that, in all likelihood, Apple’s OLED display is going to be equal to or better than the iPhone 7’s LCD display.
Each year new Apple products are featured on the Indian page with a Coming Soon label (or no label at all) alongside1. This time it reads ‘Available 7 October’ for both the iPhone 7 and the Apple Watch Series 2, making it the fastest release in India ever. I’d like to take it as a good sign and say it’s part of Apple’s continued push into India, but I don’t think that’s the only reason. I think it’s more that Apple is able to crank out iPhones faster than before, that they’re getting better at the manufacturing process. (Jet Black iPhones being the exception.) You still can’t preorder iPhones from Apple online in India since Apple doesn’t sell directly here. Online sellers have had pre-orders in the past, but the quickest way to get yourself a new iPhone is by booking it at one of Apple’s partner stores. I pre-booked my iPhone 6S last year and stood in line (barely twenty to thirty people, nothing compared to the queues in the West) at midnight. This year, I’ll be doing that for the Apple Watch Series 2. Interestingly, 7 October falls a week past the end of the third quarter. When Apple announces iPhone sales for Q3—if they announce them—those numbers won’t include Indian sales. (Maybe that works in Apple’s favour, because Indian vendors offer heavy discounts on consumer electronics during Diwali—one of India’s biggest festivals—which falls at the end of October this year. That, I assume, is when non-enthusiasts will consider a purchase.) I looked through archive.org and the iPhone 6S is the only prominent example I could find. The iPad Pro, for example, launched alongside the iPhone 6S and had a ‘Coming Soon’ label. ↩︎
I’ve been seeing tweets implying the dual-camera system on the iPhone 7 Plus would be great for 3D photography and could possibly feed into a VR system at some point in the future. Ever since, I’ve wanted to read more about how a dual-camera system would help usher in 3D, and I came across this blog post by Shutterstock’s CEO Jon Oringer that sheds some light on the matter: A flat lens right in front of a sensor (like a typical camera phone lens) doesn’t optically produce [Depth Of Field]. Today’s camera phones don’t have the ability to measure distance, so they can’t digitally re-create the DOF drama that a conventional lens does on its own. This next photo is more like one taken with a camera phone: Most of the image is in focus and there is little depth or drama to the image. Just as our two eyes work together to detect depth, two lenses do the same. By using the disparity of pixels between two lenses, the camera processor can figure out how far away parts of the image are. […] The magic is how software takes information from the two lenses and processes it into an image. Between the extra data collected from this new hardware, and the advancement of machine vision technology, the new iPhone camera is going to be incredible. Depth of Field is one of the last features necessary to complete the full migration from handheld camera to camera phone. Soon both amateur and professional photographers will only need to carry their mobile devices. Let me illustrate what I’ve understood with an example: say there are two poles in front of you, one a foot away and another ten feet away. In a dual-camera system, one camera can approximate the (relative) distance of the pole a foot away from you and the other can do the same for the one ten feet away. The first camera (the one with the shorter focus) focuses on the first pole, so that pole must be closer—implying the second is farther, which is verified by the fact that the second camera can easily focus on it.
Now, if you were to place poles between these two, each a foot apart from the next, figuring out how far away each pole is becomes a question of picking up on how in-focus or out-of-focus that pole is when seen from both cameras. (Allow me to cook up some arbitrary numbers here.) A pole that is 90% out of focus on the first—near-focused—camera and 10% out of focus on the second camera is the second-furthest pole. In a single-camera system you could only measure the fact that an object is 10% out of focus. Whether that means the object is (the equivalent distance of 10%) further away from you or nearer to you wouldn’t be as easy to determine1. This, in effect, gives you the ability to measure in the third dimension (length and breadth, and now depth). Again, this is an over-simplified illustration of what I’ve understood. I could be wrong—I’m not very familiar with the academics of optics. I’d be happy to correct any bit I got wrong. (Back in 2011, HTC released a phone called the HTC Evo 3D that had a dual-camera system and let you capture a 3D image and view it on its auto-stereoscopic display. I suppose it was a clunky experience; it never really took off.) Perhaps you could, by calculating the time light takes to bounce off an object—the further the distance, the longer the light takes to reach the camera. I assume this is how LED-assisted focus systems such as the LG G3’s work. Also note that Google’s camera app used to figure out the relative distance of objects with a single camera too, but you needed to move the camera around your subject as you would for a panorama. ↩︎
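The stereo idea the poles illustrate can be put in numbers: with two lenses a known distance apart (the baseline), the shift in an object’s position between the two images (the disparity) shrinks as the object gets farther away, so depth falls out as focal length × baseline ÷ disparity. A minimal Python sketch of that relationship—the focal length and baseline below are made-up illustrative figures, not the iPhone’s actual specs:

```python
# Depth from stereo disparity: Z = f * B / d
# f: focal length in pixels, B: baseline in metres, d: disparity in pixels.
# All constants below are hypothetical, chosen only to illustrate the maths.

FOCAL_PX = 1000.0   # hypothetical focal length, expressed in pixels
BASELINE_M = 0.01   # hypothetical 1 cm gap between the two lenses

def depth_from_disparity(disparity_px: float) -> float:
    """Distance (metres) to a point whose image shifts by this many pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_PX * BASELINE_M / disparity_px

def disparity_from_depth(depth_m: float) -> float:
    """Inverse: how many pixels a point at this depth shifts between lenses."""
    if depth_m <= 0:
        raise ValueError("depth must be positive")
    return FOCAL_PX * BASELINE_M / depth_m

# The two poles: one a foot (~0.30 m) away, one ten feet (~3.05 m) away.
# The near pole shifts far more between the two images than the far one,
# which is exactly the cue the camera processor uses to rank their depths.
near_shift = disparity_from_depth(0.30)
far_shift = disparity_from_depth(3.05)
print(f"near pole: {near_shift:.1f} px, far pole: {far_shift:.1f} px")
```

The point of the sketch is only the inverse relationship: a big pixel shift means close, a small one means far, and everything in between can be ranked accordingly.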
The new iPhones made a big leap in camera performance this year, as is the case every year. This time, the feature list includes a DCI-P3 compliant camera and display. As per custom, Apple showcased some images in their original state to show off the camera’s prowess, and I think I noticed some similarities in the photos. You can go have a look at the photos on Apple’s website. Each photo generally highlights one of two properties—great low-light performance or pronounced colour—sometimes even a mixture of the two. But to me the absolutely stunning photos are the ones where the colour is rich, with healthy variation and a lot of mixing and matching. (Have a look at the image of the fair in this BuzzFeed article too.) I think DCI-P3 is playing a large part here1 and these pictures were taken to leverage the new colour space. Perhaps what makes these images look brilliant is that one’s eyes are able to pick up on and distinguish between the various colours in these pictures and, in the process, appreciate the nuance of colour. If that is the case, the difference between the iPhone 6S and iPhone 7 cameras would be less apparent in a bland shot without much colour variation than in—let’s say—a photograph of a rich landscape. I can’t be completely sure about my reasoning, of course, since I don’t have a side-by-side comparison, I haven’t seen the phones in person, and I am viewing these pictures on a display that doesn’t support DCI-P3, but I am quite confident these pictures were chosen with the new colour spec in mind. Of course, there’s a lot of work by the other components in the camera system here–the ISP, the sensor, etc. ↩︎
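How much wider is DCI-P3 than sRGB, anyway? One crude way to compare gamuts is to take the standard CIE 1931 chromaticity coordinates of each space’s red, green, and blue primaries and measure the area of the triangle they span on the xy diagram—area there is only a rough proxy for “how many colours”, but it conveys the gap. A small Python sketch:

```python
# Compare the sRGB and DCI-P3 gamut triangles on the CIE 1931 xy diagram.
# Primaries below are the standard published chromaticity coordinates.
# Triangle area in xy is a crude proxy for gamut size, not a perceptual measure.

SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # R, G, B primaries

def triangle_area(pts):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = triangle_area(P3) / triangle_area(SRGB)
print(f"P3 covers {ratio:.0%} of sRGB's area")  # ≈ 136%: roughly a third larger
```

Note that P3 shares sRGB’s blue primary; the extra reach is almost entirely in the reds and greens, which fits the rich, saturated landscape shots Apple chose.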
The best coverage I’ve seen on Apple’s Irish tax debacle comes (I am restraining myself from using an ‘of course’) from The New Yorker $: People in the know—there aren’t many—simply call it A.O.I., short for Apple Operations International. And this version of Apple is much harder to pin down; it’s something like a quantum corporation whose very nature depends on who is observing it. A.O.I. is, in one sense, huge, among the largest companies that ever existed, with more than two hundred billion dollars in assets. It is also as small as a company can be, with no physical address and no employees. Phillip Bullock, the head of tax operations for Apple, told a U.S. Senate committee in 2013 that “A.O.I. is incorporated in Ireland; thus, under U.S. law it is not tax resident in the U.S.” That seemed clear enough until his next sentence. “A.O.I. is also not tax resident in Ireland because it does not meet the fact-specific residency requirements of Irish law.” It’s Irish, according to American law; not Irish, according to the Irish. A.O.I., in fact, does not legally exist anywhere, even as it takes in much of the profits from Apple sales outside of the United States. You should go read the article while it’s still relevant. Come the 7th, the removal of the headphone jack will be the talk of the town. The article may be pay-walled. ↩︎
Jim Dalrymple, The Loop (via John Gruber): Apple on Monday sent out an invitation for a special event to be held on September 7 at 10:00 am. This year’s event will be held at the Bill Graham Civic Auditorium in San Francisco, California. Bill Graham is where Apple introduced the iPad Pro and Apple TV last year. I’ll be watching this event with a vested interest—I’ve been waiting for the Apple Watch 2 for months. I am buying the Watch 2 the day it releases in India. A lot of products are nearing their refresh cycle this month (assuming a traditional one-year lifetime). The iPhone and—I still need to cross my fingers just in case—the Apple Watch 2 will be announced at the event. That leaves the 12.9-inch iPad Pro, the Apple TV, and the Macs in need of a refresh. Of the three, the Mac lineup is the oldest and perhaps most deserving of a mention. My guess is that new Macs — or at least one, enough to tout the new technical innovations and redesign — will be announced. Maybe something like 2013’s Mac Pro reveal? This will also, possibly, set the stage for the rumoured 5K Retina display. I spend the days before an Apple event re-watching all previous Apple events. It’s appetising, somehow. Last year’s September event meant a big deal to me because I’d decided on buying the iPhone 6S as long as it wasn’t a total turd. But the iPhone 6S was an upgrade from my iPhone 5C. This year, the Watch is an entirely new category for me — one that I’ve been looking forward to for a long while. I’m excited! I hope the wait is worthwhile. Also: if I were buying an iPhone this year, that dark black/piano black colour is gorgeous.
Vlad Savov and James Vincent, The Verge: Acer’s new Predator 21 X is a monster. Not only have this machine’s designers put a curved 21-inch display on a laptop for the first time ever, they’ve also gone and given it two GeForce GTX 1080 GPUs as well. Add in five cooling fans, a 7th-generation Intel Core K-series processor, and space for as much as four terabytes of SSD storage, and you have a laptop that’s beyond obscene. Unveiling the 21 X at IFA in Berlin today, Acer acknowledges that this laptop is more of a proof of engineering acumen than any sort of “big seller” retail product. Each Predator 21 X will be made to order, starting in January of next year (which is how Acer can advertise today that it will have an Intel CPU that technically hasn’t yet been announced). No big deal. New Macs will be out by then, am I right?
Sam Byford, The Verge: Google has “suspended” work on Project Ara, the initiative to build a phone with interchangeable modules for various components like cameras and batteries, according to Reuters and Recode. […] Although Project Ara has always seemed a dubious commercial prospect, the news is surprising if only because Google made a renewed effort to push the modular concept at its I/O conference earlier this year, promising a developer version for fall and a consumer release for 2017. Doesn’t surprise me. Google Glass had a better chance of making it than a modular smartphone—which is arguably the antithesis of what makes the smartphone so revolutionary a product in the first place.