Michael McWatters, on Medium, illustrating the Norman Door problem with iOS 10’s split-up Control Center: You swipe up to reveal the Control Center, then swipe sideways between the two panels. This has created a couple usability problems: The only indication that there’s another hidden panel is the carousel-style dots at the bottom of the screen; these are tiny and easy to miss, so users may not even know there’s more than one panel. Once you swipe to a given panel, you have to swipe in the opposite direction to go back, and this is where the Norman Door problem arises. Here’s how this plays out: you swipe up to reveal the Control Center’s Home panel, the default. You need the Now Playing panel, but you’re not sure where it’s hiding, so you swipe right. Wrong! That’s a dead end, but you had a 50% chance so better luck next time! Later, you want to go back to the Home panel, but you’ve forgotten whether it’s off screen to the left or right, so you swipe left again. Wrong! Another dead end. You swipe right and you’re in the right place. His proposed solution: […]treat the two panels as an infinite alternating carousel: no matter which panel you’re on, the other panel is off-screen on either side. While reading Michael’s article, I was constantly reminded of the way Apple Music solves this problem of indicating ‘there’s more content to the other side’ by showing the user just a bit of said content from the edge of the screen. (If you’re an Apple Music subscriber, open the Music app and switch to the For You tab to have a look for yourself.) Michael, in fact, proposes the edge-reveal method I just described after his readers pointed out that people who use HomeKit devices have a third Control Center panel. With a third panel, the infinite-carousel solution breaks because the user is forced either to guess or to memorise (until they’re habituated, of course) which way the panel they want lies.
But I think the infinite-carousel solution is flawed even without the problem of a third panel. Here’s why: The metaphor being used by this panel/screen-swipe gesture is that of flipping the pages of a book. (For iOS apps, the related element is called ‘UIPageViewController’.) It carries a sense of familiarity with the analog world, where flipping (swiping) forward gets you to the ‘next’ object. Keep going ‘next’ and you reach the end. You never encounter the same object again—like you would in a carousel. If you’re thinking, ‘Yes, but people are comfortable with the digital world now; designers can come up with metaphors native to the digital world’, you’d be right. Allow me, then, to put forth my second argument. An infinite carousel would break familiarity with other parts of iOS. Consider, for example, the iOS lock-screen—it uses the same paginated control: lock-screen (with notifications) at the center, widgets and camera to the left and right, respectively. An infinite-carousel solution with these screens seems incredibly bleak—even if you chose just two screens. It would be confusing for a user to have to remember two pagination paradigms—one that scrolls in a loop and one that doesn’t. (I suspect this could be the reason why playing music in a playlist doesn’t make songs loop by default; it’s only due to the nature of music listening that music apps include a loop option.) The problem still stands though: Control Center does need some attention, and of the existing solutions, I too think the shrink-panes-to-reveal-other-panes solution is the best.
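The two pagination paradigms differ only in what a forward swipe does at the last panel. A minimal sketch of that index arithmetic (an illustration only, not how UIKit implements UIPageViewController):

```python
def next_panel(current, count, looping):
    """Index of the panel revealed by a forward swipe.

    Book-flip model (UIPageViewController-style): swiping past the
    last panel is a dead end. Infinite carousel: the index wraps.
    """
    if looping:
        return (current + 1) % count
    return current + 1 if current + 1 < count else None  # dead end

# Book-flip with three panels: a forward swipe from the last one fails.
assert next_panel(2, 3, looping=False) is None
# Carousel: the same swipe wraps around to the first panel.
assert next_panel(2, 3, looping=True) == 0
```

With two panels the wrap is harmless—the ‘other’ panel is always one swipe away in either direction—but with a third, what lies to either side depends on where you are in the loop, which is exactly the guessing problem described above.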
Husain Sumra, MacRumors: The distribution center, which Apple’s global logistics partner DB Schenker will own and operate, will be in the city of Bhiwandi, near the city of Mumbai. An unnamed executive told The Economic Times that the center will “allow Apple to stock its products adequately, will ease operations and streamline its logistics and supply chains.” It will also help Apple maintain consistent pricing for its products. Apple products are pretty much always out of stock here unless they’re a) very popular like the iPhone or b) not popular at all, like the Apple TV. (The iPhone 7 Plus was barely in stock on launch-day and Apple’s retail partners still have the MacBook 101 on display.) Buying an Apple Watch was a terrible experience for me, partly because of the sub-standard retail experience and partly due to the fact that no one had stock. I haven’t found a single place I can buy myself Apple’s woven nylon bands since then—retail or online. I suppose this is some consolation?
Dan Seifert, The Verge: The company had earlier said it would not be releasing a new smartwatch in 2016, but it is now saying that it doesn’t plan to put out a new device timed to the arrival of Google’s newest wearable platform, either. Shakil Barkat, head of global product development at Moto, said the company doesn’t “see enough pull in the market to put [a new smartwatch] out at this time,” though it may revisit the market in the future should technologies for the wrist improve. This is one of the perks of being in the Apple ecosystem—modern Apple either doesn’t enter a segment or it pushes heavily forward despite slumping sales. The iPad is a good example. The Moto 360 was one of the better Android smartwatches, if not the best. I’ve hardly seen people get enthusiastic about Android smartwatches lately, the way they do about Fitbits or Apple Watches. I love my Apple Watch, and Apple’s direction for it seems ambitious; it’s reassuring to know that Apple is committed to the Watch. I’ve changed my interactions with my iPhone (I use it a lot less now) to incorporate the Watch and I like it better this way. I’m glad I don’t see myself in the position some pro users do with their Macs.
Adam, in a post on Medium: This guide is for anyone who is curious about machine learning but has no idea where to start. I imagine there are a lot of people who tried reading the wikipedia article, got frustrated and gave up wishing someone would just give them a high-level explanation. That’s what this is. Machine learning is a domain I want to learn about as a hobby. If you’ve been inquisitive about machine learning, Adam’s article is a nice resource to get a basic idea. There’s some very basic code in there too. A little fact to get you interested: the iPhone 7’s camera uses machine learning, when you click an image, to find the best picture.
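To give a flavour of what ‘very basic code’ in this space looks like, here’s a hypothetical bare-bones example of the loop at the heart of much of machine learning—fitting a line to made-up data with gradient descent (the data points and learning rate here are invented for illustration):

```python
# Fit y = w * x to toy data by gradient descent (pure Python, no libraries).
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # made-up points, roughly y = 2x

w = 0.0  # the model's single parameter, learned from the data
for _ in range(200):
    # Gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= 0.05 * grad  # step downhill

print(round(w, 1))  # settles close to 2, the slope hiding in the data
```

That’s the whole trick, conceptually: guess, measure the error, nudge the guess. Fancier versions of this loop sit underneath much of what gets labelled machine learning.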
David Brooks writing in The New York Times’ Op-Ed Column: Large parts of popular culture — and pretty much all of stand-up comedy — consist of reducing people to one or another identity and then making jokes about that generalization. The people who worry about cultural appropriation reduce people to an ethnic category and argue that those outside can never understand it. A single identity walls off empathy and the imagination. We’re even seeing a wave of voluntary reductionism. People feel besieged, or they’re intellectually lazy, so they reduce themselves to one category. And: The only way out of this mess is to continually remind ourselves that each human is a conglomeration of identities: ethnic, racial, professional, geographic, religious and so on. Even each identity itself is not one thing but a tradition of debate about the meaning of that identity. Furthermore, the dignity of each person is not found in the racial or ethnic category that each has inherited, but in the moral commitments that each individual has chosen and lived out. Not thinking of people’s choices, characters, likes and dislikes as a binary ‘this-or-that’ is one of the most important things I’ve learned, and I wanted to share this article with you (as I struggle to find time to write more often) since it’s a big step along this line of thought. I think this form of either-this-or-that classification exists so much in the tech world too—you’re either among the tech-elite or not1. I’m guilty of writing this way too and I’d like to change that. Sure, a common agreement could be that a person can lie somewhere in the middle, be a combination of the two extremes, but that still seems like a disservice to our complex and diverse nature. You could be part tech-savvy and part not, but you are also part woman, man, grown-up, childlike, smart, foolish, extroverted, introverted etc.
and I think those parts of you add up and influence your decisions more than an aggregated summary of you would suggest. While it seems convenient, in writing about technology2, to refer to people as ‘tech-people’ or ‘non-tech-people’, I hope that practice doesn’t carry into our understanding of those people. I’ve heard the phrase ‘technologically-challenged’ thrown around constantly. ↩︎ Or any other category—the original NYT story is about the American elections and how the pollsters were way off-mark. ↩︎
I don’t think it’s a stretch to call today’s Town Hall event the ‘Mac event’ considering all the leaks and the ‘hello, again’ wording on the press invites. But among all the talk about new Macs, somewhere, there are hints at refreshed (Pro) software being announced at the event too. First, a tweet from Mark Gurman: A Final Cut Pro website was invited to next week’s Apple event. So yes, a big day for video editors. And second, from Luca Maestri—Apple’s CFO—during the quarterly earnings report (emphasis mine): […]and we’ll have some exciting news to share with current and future Mac owners very soon. Pro software running on a refreshed MacBook Pro would speak to the power of the new hardware, but it would be a complete package—so to speak—when running on new pro hardware; draw your inferences. (Luca’s comment on sharing exciting news with current Mac owners may also be interpreted as: new Macs make for a great upgrade.) On a side-note: the leaked MacRumors images of the MacBook Pro—if accurate—show the same ‘MacBook Pro’ label below the screen, facing you, as on the Retina MacBook. I find that label very displeasing. CORRECTION: Luca Maestri, not Tim Cook as I previously stated, made the comment on exciting news for Mac owners.
Jordan Golson, The Verge: Another replacement Samsung Galaxy Note 7 has caught fire, bringing the total to three this week alone. This one was owned by Michael Klering of Nicholasville, Kentucky. He told WKYT that he woke up at 4AM to find his bedroom filled with smoke and his phone on fire. Making fun of Samsung when their first batch of phones was catching fire seemed wrong. This problem could’ve struck any company—although Samsung possibly backed themselves into this position because of a rushed release, wanting to capitalise on a ‘dull iPhone’—so it wasn’t fair to add insult to injury. But Samsung turning a blind eye towards the replacement Note 7s is inexcusable. It’s reached the point where American carriers are taking matters into their own hands. Additionally, I feel like there must be other cases around the world where Note 7s caught fire; they’re just going unreported. I haven’t seen a single Note 7 advertisement here in India. The two people I know who’ve been on a flight recently told me their respective airlines asked all passengers to turn off their Note 7s.
Much ado has been made about Apple’s Airpods and all of it can be summed up under two broad categories: form and function, and pricing. Here are my thoughts on the Airpods, filed under the same groups.
Form and function
A lot of people complain about the lack of playback controls on the Airpods—a play/pause trigger, at the very least. I probably would too, since it’s just one action that simply toggles state (play if paused, pause if playing), so it seems like a trivial demand. All you can do, though, is double-tap to trigger Siri. Maybe Apple will add play/pause to the Airpods in the future (I have my reservations), but for now, I wanted to understand Apple’s decision to go with just one action. Let’s work our way backwards: a) Tap-and-hold wouldn’t work because the Airpods don’t actually detect a tap the way the iPhone’s screen does. An accelerometer picks up on the tap-tap movement to trigger Siri. Surely tap-and-hold could be implemented, but I doubt it would be as accurate. b) You can’t trigger play/pause with a single tap since you’d end up triggering play/pause whenever you even graze the Airpod (think about the times you adjust an Airpod in your ear). c) A triple-tap just seems complicated. Think of how many times you’d be telling someone ‘oh, you triggered Siri since you didn’t tap fast enough’. On a device with a single input method (one you can’t even see when it’s in operation), this can get frustrating if it’s recurring. (At this point, you’re obviously thinking of the button on the wired Earpods. If I call a triple-tap complicated, surely the Earpods’ press-press-press and press-press-hold actions run circles around the Airpods? Yes, but Apple doesn’t need you to be sold on the Earpods the way it does on the Airpods.
If you’re frustrated with the button on the Earpods, you pull out your iPhone to complete your task; if you’re frustrated with the Airpods’ tap-tap-slap-whack, you might end up giving up on Apple’s second wearable platform…and then resorting to using your iPhone anyway.) d) Why not just include a button on the Airpods? You might as well be asking for the next version of iOS to support Flash. A double-tap, on the other hand, has a very high rate of detection (given that it’s the only action), and Siri can lead you to play/pause anyway—at the expense of some comfort to a user accustomed to a dedicated play/pause action. And—this is key, and the reason I have reservations about Apple adding a dedicated play/pause action—a play/pause control is archaic. When you see sci-fi movies, you don’t see humans pressing a play button to start playing music; they just bark at the always-listening AI. (Let me throw this out there: I think Apple would remove the tap-tap once the Airpods have sufficient battery to support an always-listening Siri.) If you look back to when the Airpods were just a rumour, one of people’s complaints was that they’d have yet another thing to charge. Again, a valid concern, but what matters then is how Apple helps mitigate that pain-point, and I think the Airpods handle the mitigation very well. The genius of the Airpods’ design is that ‘putting them away’ implicitly means charging them, since you instinctively reach out to put them back in the case (where they start charging). People listening for long stretches—on a flight, let’s say—aren’t inclined to take their Airpods off frequently, but day-to-day usage won’t have you listening for more than 3-4 hours at a stretch. People often listen to their audio-content in breaks. When you’re done, you’re inclined to put them back in the case owing to their miniature size.
Additionally, you can charge the Airpods’ case while listening to your audio-content. The Airpods’ design, in effect, is inclined to reduce low-battery grievances. Owning yet another accessory that you have to charge is an added burden whichever way you put it, but knowing that all of them charge via a single cable cushions the pain a bit. I carry my Mac, my iPhone, and an old pair of Bluetooth headphones to work every day; the headphones require me to carry a micro-USB cable along with the iPhone’s Lightning cable. If I used Airpods, I’d leave yet another wire behind. You’ve heard by now that iOS 10 opens the system up to various parts of the OS and that third parties can take advantage of that. With the Airpods (and the Watch, previously), Apple is doing the same for the iPhone’s concept itself.
Price
$159 is expensive territory for headphones—let’s just get that out of the way. But I think the perceived expensiveness is driven by the fact that the Airpods are being compared to the wired Earpods they ‘replace’, not to other Bluetooth headphones. Bose’s Bluetooth headphones are considerably more expensive than $159, and what the Airpods lack in sound-quality, they make up for in ease of pairing and—crucially—a better Bluetooth-listening experience. What good are your expensive, awesome-sounding Bose headphones if the connection to your phone is flaky and unreliable? A lot of people think the Airpods are a money grab. I think if that were Apple’s goal, they would’ve tied them in with some form of exclusivity with the new(er) iPhones. The Airpods work with iPhones as old as 2012. Lastly, I agree with a lot of what Gruber says in his recent episodes of The Talk Show—specifically, that Apple is selling the Airpods at a bare-minimum profit margin. I think Apple would have really loved to ship the Airpods in the box, but they can’t because the production cost is considerably high.
You can’t preach the ‘wireless is the future’ message and include wired headphones with your flagship product; and you can bet Apple knows that.
The Realm Team: Today, we’re launching the Realm Mobile Platform, a new offering that integrates our fully open-source (see below!) client-side database for iOS and Android with new server-side technology providing realtime synchronization, conflict resolution, and reactive event handling. The new platform makes it easy for mobile developers to create apps with difficult-to-build features like realtime collaboration, messaging, offline-first experiences, and more. Our Android developer and I run our company’s apps on top of Realm’s database1. This is great news for apps that need cross-platform (or simply cross-device) synchronisation. CloudKit handles storage and sync across multiple iOS devices. From what I’ve understood from Realm’s article, their service only manages the syncing—you give up the storage in going cross-platform. And, as I mentioned in the footnote, Realm is far better at being a simple database than CoreData—at least for a beginner like me. Look, I’m sure CoreData has tremendous value as you start peeling the onion’s layers apart in need of deeper, more precise functionality, but Realm is far more manageable for a basic database. ↩︎
Probably the first Apple Music ad on Apple’s YouTube channel since its introduction—the others were on Beats’ YouTube channel. Subpar ads shouldn’t get away with such long (in ad parlance) durations without catching some flak. If this were playing on TV, I’d change the channel.
Joe Rossignol, MacRumors: DisplayMate Technologies has declared iPhone 7 has the "best LCD display" it has ever tested, calling it "truly impressive" and a "major upgrade" over the iPhone 6 based on a series of advanced viewing tests and measurements. Remember the rumours about Apple shifting towards an OLED screen with the iPhone 7? I don’t doubt the idea of Apple choosing an OLED screen in future iPhones (yet another rumour points to Samsung making an OLED display for the 2017 iPhone) because OLED lets the phone’s display blend in and become ‘one’ with the rest of the phone, as it does on Apple Watch — ‘…you can’t determine a boundary between the physical object and the software’. (Probably no more of those ugly black lines around the iPhone’s display either.) It’s fascinating that in all likelihood, Apple’s OLED display is going to be equal to or better than the iPhone 7’s LCD display.
Each year, new Apple products are featured on the Indian page with a Coming Soon label (or no label at all) alongside1. This time it reads ‘Available 7 October’—for both iPhone 7 and Watch Series 2—making it the fastest release in India ever. I’d like to take it as a good sign and say it’s part of Apple’s continued efforts in pushing into India, but I don’t think that’s the only reason. I think it’s more so because Apple is able to crank out iPhones faster than before—that they’re getting better at the manufacturing process. (Jet Black iPhones being the exception.) You still can’t preorder iPhones from Apple online in India since Apple doesn’t sell directly here. Online sellers have had pre-orders in the past, but the quickest way to get yourself a new iPhone is by booking it at one of Apple’s partnering stores. I pre-booked my iPhone 6S last year and stood in the line (barely twenty-thirty people, nothing compared to the ones in the West) at midnight. This year, I’ll be doing that for the Apple Watch Series 2. Interestingly, October 7th is a week past the end of the third quarter. When Apple announces iPhone sales for Q3—if they announce it—those numbers won’t include Indian sales. (Maybe that works in Apple’s favour, because Indian vendors offer heavy discounts on consumer electronics during Diwali—one of India’s biggest festivals—scheduled this year at the end of October. That, I assume, is the period when non-enthusiasts will consider a purchase.) I looked through archive.org and the iPhone 6S is the only prominent example I could find. The iPad Pro, for example, launched alongside the iPhone 6S and had a ‘Coming Soon’ label along with it. ↩︎
I’ve been seeing tweets that imply the dual-camera system on the iPhone 7 Plus would be great for 3D photography and could possibly add to a VR system at some point in the future. Ever since, I’ve wanted to read more about how the dual-camera system would help usher in 3D, and I came across this blog-post by Shutterstock’s CEO Jon Oringer that sheds some light on the matter: A flat lens right in front of a sensor (like a typical camera phone lens) doesn’t optically produce [Depth Of Field]. Today’s camera phones don’t have the ability to measure distance, so they can’t digitally re-create the DOF drama that a conventional lens does on its own. This next photo is more like one taken with a camera phone: Most of the image is in focus and there is little depth or drama to the image. Just as our two eyes work together to detect depth, two lenses do the same. By using the disparity of pixels between two lenses, the camera processor can figure out how far away parts of the image are. […] The magic is how software takes information from the two lenses and processes it into an image. Between the extra data collected from this new hardware, and the advancement of machine vision technology, the new iPhone camera is going to be incredible. Depth of Field is one of the last features necessary to complete the full migration from handheld camera to camera phone. Soon both amateur and professional photographers will only need to carry their mobile devices. Let me illustrate what I’ve understood with an example: say there are two poles in front of you, one a foot away and another ten feet away. In a dual-camera system, one camera can approximate the (relative) distance of the pole a foot away from you and the other can do the same for the one that’s ten feet away. The first camera (the one with the shorter focus) focuses on the first pole, so it must be closer, implying the second is farther—verified by the fact that the second camera can easily focus on it.
Now, if you were to place poles between these two poles, each a foot apart, working out how far each pole is becomes a question of picking up on how in-focus or out-of-focus that pole is when seen from both cameras. (Allow me to cook up some random, arbitrary numbers here.) A pole that is 90% out-of-focus on the first—near-focused—camera and 10% out-of-focus on the second camera is the second-furthest pole. In a single-camera system you could only measure the fact that an object is 10% out of focus. Whether that means the object is (the equivalent distance of 10%) further away from you or nearer to you wouldn’t be as easy to determine1. This, in effect, gives you the ability to measure in the third dimension (length and breadth, and now depth). Again, this is an over-simplified illustration of what I’ve understood. I could be wrong—I’m not very familiar with the academic side of optics. I’d be happy to correct any bit that I got wrong. (Back in 2011, HTC released a phone called the HTC Evo 3D that had a dual-camera system and allowed you to capture a 3D image and view it on the phone’s auto-stereoscopic display. I suppose it was a clunky experience; it never really took off.) Perhaps you could, by calculating the time light takes to bounce off an object—the further the distance, the longer the light takes to reach the camera. I assume this is how LED-assisted focus systems such as LG’s G3’s work. Also note that Google’s camera app used to figure out the relative distance of objects through a single-camera system too, but you needed to move the camera around your object as you would for a panorama. ↩︎
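The pole illustration maps onto the standard stereo formula: with two parallel cameras a known distance apart, an object’s distance is inversely proportional to its disparity—how far its pixels shift between the two views. A toy sketch with invented numbers (the focal length and lens spacing below are hypothetical, not the iPhone 7 Plus’s actual values):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance to an object, from the pixel shift between two views.

    depth = focal_length * baseline / disparity: near objects shift a
    lot between the two lenses; far objects barely move.
    """
    return focal_px * baseline_m / disparity_px

FOCAL = 1000.0   # focal length in pixels (hypothetical)
BASELINE = 0.01  # 1 cm between the two lenses (hypothetical)

near = depth_from_disparity(FOCAL, BASELINE, 32.8)  # big shift -> close pole
far = depth_from_disparity(FOCAL, BASELINE, 3.28)   # small shift -> far pole
print(round(near, 2), round(far, 2))  # roughly 0.3 m and 3 m: a foot and ten feet
```

This is also why a single camera struggles on its own: one view gives one blur measurement with nothing to triangulate against—matching the footnote’s point that you’d need a time-of-flight measurement or camera movement instead.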
The new iPhones made a big leap in camera performance this year, as is the case every year. This time, the feature list includes a DCI-P3-compliant camera and display. As per custom, Apple showcased some images in their original state to show off the camera’s prowess, and I think I noticed some similarities in the photos. You can go have a look at the photos on Apple’s website. Each photo generally highlights one of two properties—great low-light performance or pronounced colour; sometimes even a mixture of the two. But to me the absolutely stunning photos are the ones where the colour is rich, and there’s healthy variation and a lot of mixing and matching of it. (Have a look at the image of the fair in this BuzzFeed article too.) I think DCI-P3 is playing a large part here1 and these pictures were taken to leverage the new colour space. Perhaps what makes these images look brilliant is the fact that one’s eyes are able to pick up on and distinguish between the various colours in these pictures, and in the process appreciate the nuance of colour. If that is the case, the difference between the iPhone 6S and iPhone 7 cameras would be less apparent in a bland shot without a lot of colour variation than in—let’s say—a photograph of a rich landscape. I can’t be completely sure about my reasoning, of course, since I don’t have a side-by-side comparison, I haven’t seen the phones in person, and I am viewing these pictures on a display that doesn’t support DCI-P3, but I am quite confident these pictures were chosen with the new colour spec in mind. Of course there’s a lot of work by the other components in the camera system here—the ISP, the sensors etc. ↩︎
The best coverage I’ve seen on Apple’s Irish tax debacle comes (I am restraining myself from using an ‘of course’) from The New Yorker $: People in the know—there aren’t many—simply call it A.O.I., short for Apple Operations International. And this version of Apple is much harder to pin down; it’s something like a quantum corporation whose very nature depends on who is observing it. A.O.I. is, in one sense, huge, among the largest companies that ever existed, with more than two hundred billion dollars in assets. It is also as small as a company can be, with no physical address and no employees. Phillip Bullock, the head of tax operations for Apple, told a U.S. Senate committee in 2013 that “A.O.I. is incorporated in Ireland; thus, under U.S. law it is not tax resident in the U.S.” That seemed clear enough until his next sentence. “A.O.I. is also not tax resident in Ireland because it does not meet the fact-specific residency requirements of Irish law.” It’s Irish, according to American law; not Irish, according to the Irish. A.O.I., in fact, does not legally exist anywhere, even as it takes in much of the profits from Apple sales outside of the United States. You should go read the article while it’s still relevant. Come the 7th, the removal of the headphone jack will be the talk of the town. The article may be pay-walled. ↩︎