‘Gather Round’, and Adventures of a Curious Character

Two events prompted me to write today: Apple’s September event, and finishing one of the books I’ve been reading: ‘“What Do You Care What Other People Think?”: Further Adventures of a Curious Character’ by Richard Feynman. These have given me the itch to write and here I am, scratching it.


The stuff that doesn’t immediately affect me

The new iPhones are nice. I have nothing to contribute about the new iPhone XS that you haven’t already encountered through other people’s work. Like most of the internet, I find the iPhone XR to be the most interesting of the three iPhones.

I find it very hard to recommend things to people (recommending books, for me, is tougher than recommending gadgets) since I want the recommendations I make to be suited to the person’s tastes. I think the iPhone XR makes for an easy recommendation, without knowing anything about the person in advance.

I’m also very happy that, for the first time, the new iPhones are coming to India the same month they were announced. I’ll be sure to get my hands on the XS Max, and the XR, in person.


The stuff that does

I’ve long been waiting for a new Apple Watch. My Series 2 is pretty slow; in my book, it’s barely functional. I’ve seen the Series 3 and I envy its users for how fast it is. But I generally like to stick to (at least) a two-year cycle with my Apple gadgets, so I decided 2018 would be the year I’d buy an Apple Watch.

The wait was worth it, and then some! This September, the Series 4 proved the biggest increment over its predecessors yet. I don’t understand the ECG sensor, or what its application would be in my life, but that’s ignorance on my part. Regardless, I fully appreciate the technical achievement here, and I’m really happy Apple’s doing work to help people live a better day.

Of the health features announced, my favourite was fall detection. Especially the bit where, if the Watch detects a fall, it automatically calls for help on your behalf after a minute of immobility—the thought behind this feature warms my heart.

Although, personally, if I fell, I feel like I’d now have the added burden of turning off the Watch’s one-minute timer before it starts calling people on my behalf…in addition to dealing with my fall.

Three things I’m really looking forward to with this upgrade: A faster Watch, leaving my phone behind (since I’m getting the LTE version), and Siri talking back to me.

One thing I’m really not looking forward to: the increased price.


The missing mat, and AirPods

I’ve been looking forward to AirPower since it was announced last year. I don’t like the clunkiness of wires for charging my iPhone, AirPods, and Apple Watch. I charge my iPhone wirelessly already, but my AirPods and Watch still need wires.

Additionally, my AirPods barely cut it for me. Their batteries—both the pods and the case—discharge much faster than when I bought them. Nothing beyond ordinary battery degradation (especially given the small batteries in the pods), but I’d still prefer a new set of AirPods. I’m also plagued by a bigger issue where the mics on the AirPods barely work.

I thought I’d be picking up a new set of AirPods this year…and we got nothing.

There’s speculation that the absence of new AirPods (I still think there’s something off about that opening sequence in the event) ties in with the absence of AirPower, but I don’t think so. If either one of the rumoured features comes to the AirPods—hands-free Siri, or water resistance—it’s enough to justify a new set, given that the AirPods’ small size affords Apple less wiggle room. Perhaps the delay comes from getting both (or more) features into the AirPods? I’m not sure here.

I’m willing to bet there’s an event (or a set of press releases) coming our way this year for new Apple products that didn’t make the cut for this keynote. If we don’t get AirPods even then, I’m going to have to consider buying a new set of the current generation of AirPods.

The possibility of a second hardware event this year sure is exciting. I’m looking forward to seeing what Apple has in store for the new Macs, and the iPad Pros (in that order). Hopefully there are new AirPods too.


Also: The Feynman book was great!

In Uncategorized by Mayur Dhaka

Reservations on going iPad-first: Impressions of the 10.5″ iPad Pro

I recently got a 10.5″ iPad Pro–an upgrade from my iPad Air 2. My Air 2 had been reduced to a consumption device: watching YouTube, Netflix, reading content, etc. The Air 2 was noticeably slow, and I’ve mentioned previously {Find previous mention and insert here} how I found the screen just a little too small to comfortably type on. I’d had my eyes on the iPad Pro (9.7-inch, at the time) for a while, and my primary use would be jotting notes with the Pencil while watching/reading some educational content such as a WWDC video. This was roughly when the 9.7″ Pro was mid-way through its lifecycle and rumours of a larger-screened iPad were afloat. I decided to wait it out for the new iPad.

Come the 8th of July, I was pulling a 256GB 10.5″ Pro out of the box, with an Apple Pencil that would follow. I was delighted to find the on-screen keyboard was the perfect size–for me–to type on. I loved the display (firsts for me: P3 colour gamut, True Tone, and ProMotion), and the faster Touch ID. Everything is so fast on this iPad–faster than any computing device I have ever used.

Soon enough, my brain crossed the pencil-on-paper to Apple Pencil-on-iPad chasm too, and I was taking notes in GoodNotes opened in Split View alongside the WWDC app. This particular workflow has become second nature for me. But I was still missing one accessory: the Smart Keyboard.

In anticipation, I connected my old Magic Keyboard to the iPad Pro to test how Swift Playgrounds behaves with a keyboard attached, since I am really averse to the way Playgrounds tries to autocomplete everything on your behalf1. I was relieved to find that Playgrounds isn’t pushy about helping you out if you’re typing on a physical keyboard. Using Playgrounds on the iPad is important to me because I want to actively use the iPad for development as much as possible.
Why? And why do I not want to stick to the Mac–a platform that is very mature for the kind of use cases I’m looking at? I don’t know. I could tell you that my Mac feels really slow2 (especially when working on the iOS app I make), that it doesn’t have the display chops of the iPad Pro, that it lacks the portability, etc. But I still wouldn’t want to work on it even if the Mac did offer those things. I suppose it subconsciously comes down to not having a direct-manipulation interface, but I can’t say for certain and I’ll reserve commentary on that bit.

Coming back: I had multiple reasons to be looking forward to the Smart Keyboard for the Pro, and I was lucky enough to finally find one a couple of days ago. My immediate thought after using it for a few minutes was that it is nothing like attaching the Magic Keyboard to a propped-up iPad and working that way. It comes down to the difference between being a part of your work machine vs. an accessory that’s tagging along. It’s a world of difference to me.

Before I had the Pro, I thought my primary use case for this device would be note-taking with the Pencil–given how much I (used to) constantly jot things down in a notebook. I expected I’d be fiddling with the Smart Keyboard’s origami-ish nature only rarely–and with determined intent–when I wanted to dock the iPad to type on. Ever since the Smart Keyboard arrived, though, the docked-onto-the-keyboard mode has become the ‘default mode’ for the Pro in my mind. Now I have to actively pause and think, and face the Smart Keyboard’s origami-ish fiddliness, to set the iPad in the resting-flat-on-desk mode to jot down notes with the Pencil.

A quick word on the QuickType suggestions: I think they’re super useful. When using Playgrounds on the iPad, the suggestions offered are code completions, and I actively rely on them to finish whatever I’m typing. The placement of the bar right above the Smart Keyboard, assisted by the thin side bezels of the iPad, makes selecting a suggestion nearly effortless. This, compared to the Touch Bar on the MacBooks, is a better approach. You can’t see the Touch Bar’s prompts if you’re slumping in your chair, given that it lies flat with the keyboard, and selecting anything on the Touch Bar is harder still. On the Pro, where the QuickType suggestions bar is facing you, you can always see the suggestions and they’re easy to select. But I digress…

The Smart Keyboard really is the reason I titled this article ‘Reservations on going iPad-first’. I can’t be iPad-only because I’m an iOS developer and I need Xcode–at the very least–to write apps with. But that doesn’t prevent me from being iPad-first. Apart from Xcode–or something tangentially related such as the Terminal–I use my MacBook extremely rarely. Gruber mentioned the feeling of ‘a hand being untied from behind his back’ when using iOS 11 on the iPad Pro. That’s precisely how I feel when using the Smart Keyboard on the iPad Pro. Having the ability to control an app’s UI with the keyboard lets me blaze through tasks3. But I am hesitant about getting used to the iPad’s way of doing things with keyboard shortcuts, because there are some notable differences from those on the Mac.

On the Mac, I use Alfred to open pretty much anything: I trigger it with Option+Space, type what I’m looking for, and press Return to open it. On the iPad, I’m forced to use Spotlight (not that I mind, because Spotlight on iOS actually works) by pressing Command+Space. Switching between two full-screen apps on the iPad is Command+Tab; on the Mac, I prefer using Spaces and switch between them with the four-finger swipe gesture. These are just a few of the ways that being a fluent keyboard navigator on one platform stops me from being fluent on the other. If I had to ditch one of the two devices (given that I had Xcode on the iPad, at least), the Mac goes. The iPad is the future of computing, its present state is really nice, and it makes me want to use it almost always. But the current situation of both devices means I can’t get completely comfortable with either of them.


All this might give you the impression that I don’t like my Mac at all. I do, a lot, especially because I still remember the impression it made on me when I switched after years of using Windows. The Mac will always hold a special place in my heart that maybe nothing can ever replace. It’s just that I like the iPad as a day-to-day device way more (for the tasks it can perform), and it’s far less fussy.

While we’re at it, here’s a few more thoughts on the Smart Keyboard:

  1.  I was in disbelief when I noticed the lack of an Escape key on this keyboard since my Desktop sensibilities have me hunt for the Escape key to dismiss a screen I’m looking at. But it’s probably better off this way since the iPad is ushering in a new paradigm and it doesn’t make sense to bring over legacy technology just for the sake of familiarity. Bringing over Command, Option, and Control makes sense because they aid in using keyboard shortcuts.
  2. Probably my biggest complaint with the keyboard is that I don’t have a way to tell whether Caps Lock is on, or not, without actually typing away in a text field first. Would I like the Caps Lock key to have a green light indicating ‘You have Caps Lock turned on’? No, that’s a Mac thing.
  3. When typing in a text field, you can’t type ‘iPad’ as your first word; iOS auto-corrects it to ‘IPad’ every time. Super irritating.
  4. I’d absolutely like a tiny track surface around (on?) the Smart Keyboard that allows me to a) Select text and move around the cursor, just like iOS does on the software keyboard and b) Pan around in a view that can be scrolled–A web page in Safari, a list of items, etc.
    This would drastically reduce me being forced to interact with the touch screen, without bringing in the confusion of a pointer.
  5. iPad, docked in typing mode, is incredibly sturdy. Poking at the screen doesn’t have any give to it. The same can’t be said of the viewing mode (when the keyboard is tucked away behind-and-against the iPad). The angle just doesn’t seem right and it feels like the iPad will just topple forward on its screen.
  6. I hate the fact that the micro-fibre material that lines the under-side of the keyboard and rests directly on the iPad’s screen when folded up in ‘off’ mode, also rests on the surface when in ‘docked-to-type’ mode. Thinking about the micro-fibre catching all kinds of crap (and, god forbid, liquids) from the surface it’s resting on is the stuff of nightmares.

  1. I understand why Playgrounds on the iPad does this–to make touch-typing easier for beginners. But I’m not the intended audience for this feature: I have muscle memory and habits ingrained in me from the Mac paradigm. Playgrounds’ aggressive auto-completion works against that habit. ↩︎
  2. It’s a 2014 base-configuration 13″ Retina MacBook Pro–not the best Mac to be writing apps on anyway. ↩︎
  3. As a result, I have become very cognisant of Apple’s apps that don’t support keyboard shortcuts. Most notable culprit: Apple Music. ↩︎


On Control Center’s Norman Door problem

Michael McWatters, on Medium, illustrating the Norman Door problem with iOS 10’s split-up Control Center:

You swipe up to reveal the Control Center, then swipe sideways between the two panels. This has created a couple usability problems:

  • The only indication that there’s another hidden panel is the carousel-style dots at the bottom of the screen; these are tiny and easy to miss, so users may not even know there’s more than one panel.
  • Once you swipe to a given panel, you have to swipe in the opposite direction to go back, and this is where the Norman Door problem arises.

Here’s how this plays out: you swipe up to reveal the Control Center’s Home panel, the default. You need the Now Playing panel, but you’re not sure where it’s hiding, so you swipe right. Wrong! That’s a dead end, but you had a 50% chance so better luck next time!

Later, you want to go back to the Home panel, but you’ve forgotten whether it’s off screen to the left or right, so you swipe left again. Wrong! Another dead end. You swipe right and you’re in the right place.

His proposed solution:

[…]treat the two panels as an infinite alternating carousel: no matter which panel you’re on, the other panel is off-screen on either side.

While reading Michael’s article, I was constantly reminded of the way Apple Music solves this problem of indicating ‘there’s more content to the other side’ by showing the user just a bit of said content peeking in from the edge of the screen. (If you’re an Apple Music subscriber, open the Music app and switch to the For You tab to have a look for yourself.)

Michael himself later proposes the edge-revelation method I just described, after his readers pointed out that people who use HomeKit devices get a third Control Center panel. In that case the infinite-carousel solution breaks down, because the user is forced into either hit-and-trial or memorisation (until they’re habituated—of course) to get to the panel they want.

But I think the infinite-carousel solution is flawed even if there weren’t the problem of a third panel. Here’s why:

  • The metaphor being used by this panel/screen-swipe gesture is that of flipping the pages of a book. (For iOS apps, the related element is called ‘UIPageViewController’.) It has a sense of familiarity with the analog world where flipping (swiping) forward gets you to the ‘next’ object. Keep going ‘next’ and you reach the end. You never encounter the same object again—like you would in a carousel. If you’re thinking, ‘Yes, but people are comfortable with the digital world now; designers can come up with metaphors native to the digital world’, you’d be right. Allow me, then, to put forth my second argument.
  • An infinite carousel would break familiarity with other parts of iOS. Consider, for example, the iOS lock screen—it uses the same paginated control: the lock screen (with notifications) at the centre, with widgets and camera to the left and right, respectively. An infinite-carousel version of these screens seems incredibly bleak—even if you chose just two screens.
    It would be confusing for a user to have to remember two pagination paradigms—one that scrolls in a loop and one that doesn’t. (I suspect this could be the reason playing a playlist doesn’t loop songs by default; it’s only due to the nature of music listening that music apps include a loop option.)
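The difference between the two paradigms reduces to simple index arithmetic. Here’s a minimal Swift sketch of my own (purely an illustration, not how UIPageViewController actually works): a bounded, page-flip pager returns nothing at the ends, while a looping carousel always has a next page.

```swift
// Bounded page-flip vs. infinite carousel, reduced to index arithmetic.
// direction is +1 for a forward swipe, -1 for a backward swipe.
func nextPage(from index: Int, direction: Int, count: Int, looping: Bool) -> Int? {
    let candidate = index + direction
    if looping {
        // Carousel: wrap around, so there is always a "next" page.
        return ((candidate % count) + count) % count
    }
    // Page-flip: swiping past either end is a dead end.
    return (0..<count).contains(candidate) ? candidate : nil
}
```

With count set to 2 and looping on, every swipe in either direction lands on the other panel—which is exactly the infinite alternating carousel Michael proposed.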

The problem still stands though: Control Center does need some attention, and of the existing solutions, I too think the shrink-panes-to-reveal-other-panes solution is the best.


Apple looking to set up a distribution center in India

Husain Sumra, MacRumors:

The distribution center, which Apple’s global logistics partner DB Schenker will own and operate, will be in the city of Bhiwandi, near the city of Mumbai. An unnamed executive told The Economic Times that the center will “allow Apple to stock its products adequately, will ease operations and streamline its logistics and supply chains.” It will also help Apple maintain consistent pricing for its products.

Apple products are pretty much always out of stock here unless they’re a) very popular like the iPhone or b) not popular at all, like the Apple TV. (The iPhone 7 Plus was barely in stock on launch-day and Apple’s retail partners still have the MacBook 101 on display.)

Buying an Apple Watch was a terrible experience for me, partly because of the sub-standard retail experience and partly due to the fact that no one had stock. I haven’t found a single place I can buy myself Apple’s woven nylon bands since then—retail or online. I suppose this is some consolation?


The Verge: ‘Moto pushes off smartwatches indefinitely’

Dan Seifert, The Verge:

The company had earlier said it would not be releasing a new smartwatch in 2016, but it is now saying that it doesn’t plan to put out a new device timed to the arrival of Google’s newest wearable platform, either.

Shakil Barkat, head of global product development at Moto, said the company doesn’t “see enough pull in the market to put [a new smartwatch] out at this time,” though it may revisit the market in the future should technologies for the wrist improve.

This is one of the perks you get by being in the Apple ecosystem—modern Apple either doesn’t enter a segment or it pushes heavily forward despite slumping sales. The iPad is a good example.

The Moto 360 was one of the better Android smartwatches, if not the best. I’ve hardly seen people get enthusiastic about Android smartwatches lately, the way they do about Fitbits or Apple Watches.

I love my Apple Watch, and Apple’s direction for it seems ambitious too. It’s reassuring to know that Apple is committed to the Watch. I’ve changed how I interact with my iPhone (I use it a lot less now) to incorporate the Watch, and I like it better this way. I’m glad I don’t see myself in the position some pro users are in with their Macs.


Adam Geitgey: ‘Machine Learning is Fun!’

Adam, in a post on Medium:

This guide is for anyone who is curious about machine learning but has no idea where to start. I imagine there are a lot of people who tried reading the wikipedia article, got frustrated and gave up wishing someone would just give them a high-level explanation. That’s what this is.

Machine learning is a domain I want to explore as a hobby. If you’ve been curious about machine learning, Adam’s article is a nice resource for getting a basic idea, with some very basic code along the way. A little fact to get you interested: the iPhone 7’s camera uses machine learning to help find the best picture when you click an image.


NYT: “The Danger of a Dominant Identity”

David Brooks writing in The New York Times’ Op-Ed Column:

Large parts of popular culture — and pretty much all of stand-up comedy — consist of reducing people to one or another identity and then making jokes about that generalization. The people who worry about cultural appropriation reduce people to an ethnic category and argue that those outside can never understand it. A single identity walls off empathy and the imagination.

We’re even seeing a wave of voluntary reductionism. People feel besieged, or they’re intellectually lazy, so they reduce themselves to one category.


The only way out of this mess is to continually remind ourselves that each human is a conglomeration of identities: ethnic, racial, professional, geographic, religious and so on. Even each identity itself is not one thing but a tradition of debate about the meaning of that identity. Furthermore, the dignity of each person is not found in the racial or ethnic category that each has inherited, but in the moral commitments that each individual has chosen and lived out.

Not thinking of people’s choices, characters, likes, and dislikes as a binary ‘this-or-that’ is one of the most important things I’ve learned, and I wanted to share this article with you (as I struggle to find time to write more often) since it’s a big step along this line of thought.

I think this form of either-this-or-that classification exists in the tech world too—you’re either among the tech-elite or not1. I’m guilty of writing this way too, and I’d like to change that.
Sure, we could agree that a person can lie somewhere in the middle, be a combination of the two extremes, but that still seems like a disservice to our complex and diverse nature. You could be part tech-savvy and part not, but you are also part woman, man, grown-up, childlike, smart, foolish, extroverted, introverted, etc., and I think those parts of you add up and influence your decisions more than an aggregated summary of you would suggest.

While it stands to reason that in writing about technology2 it seems convenient to refer to people as ‘tech-people’ or ‘non-tech-people’, I hope that practice doesn’t carry into our understanding of those people.

  1. I’ve heard the phrase ‘technologically-challenged’ thrown around constantly. ↩︎
  2. Or any other category—the original NYT story is about the American elections and how the pollsters were way off-mark. ↩︎


Refreshed software at the upcoming Mac event

I don’t think it’s a stretch to call today’s Town Hall event the ‘Mac event’ considering all the leaks and the ‘hello, again’ wording on the press invites. But among all the talk about new Macs, somewhere, there are hints at refreshed (Pro) software being announced at the event too. First, a tweet from Mark Gurman:

A Final Cut Pro website was invited to next week’s Apple event. So yes, a big day for video editors.

And second, from Luca Maestri–Apple’s CFO–during the quarterly earnings report (emphasis mine):

[…]and we’ll have some exciting news to share with current and future Mac owners very soon.

Pro software running on a refreshed MacBook Pro would speak to the power of the new hardware but it would be a complete package—so to speak—when running on new pro hardware; draw your inferences.

(Luca’s comment on sharing exciting news with current Mac owners may also be interpreted as, new Macs make for a great upgrade.)

On a side note: the leaked MacRumors images of the MacBook Pro—if accurate—show the same ‘MacBook Pro’ label below the screen, facing you, as on the Retina MacBook. I find that label very displeasing.

CORRECTION: Luca Maestri, not Tim Cook as I previously stated, made the comment on exciting news for Mac owners.


Third replacement Note 7 catches fire this week

Jordan Golson, The Verge:

Another replacement Samsung Galaxy Note 7 has caught fire, bringing the total to three this week alone. This one was owned by Michael Klering of Nicholasville, Kentucky. He told WKYT that he woke up at 4AM to find his bedroom filled with smoke and his phone on fire.

Making fun of Samsung when their first batch of phones was catching fire seemed wrong. This problem could have struck any company—although Samsung possibly backed themselves into this position with a rushed release, wanting to capitalise on a ‘dull iPhone’—so it wasn’t fair to rub salt in the wound. But Samsung turning a blind eye towards the replacement Note 7s is inexcusable. It’s reached the point where American carriers are taking matters into their own hands.

Additionally, I feel like there must be other cases around the world where Note 7s caught fire; they’re just going unreported.

I haven’t seen a single Note 7 advertisement here in India. The two people I know who’ve been on a flight recently told me their respective airlines asked all passengers to turn off their Note 7s.


Assorted thoughts on the AirPods

Much ado has been made about Apple’s AirPods, and all of it can be summed up under two broad categories: form and function, and pricing.

Here are my thoughts on the AirPods, filed under the same groups.

Form and function

  • A lot of people complain about the lack of playback controls on the AirPods—a play/pause trigger, at the very least. I probably would too, since it’s just one action used to simply toggle state (play if paused, pause if playing), so it seems like a trivial demand. All you can do, though, is double-tap to trigger Siri.
    Maybe Apple adds play/pause to the AirPods in the future (I have my reservations) but for now, I wanted to understand Apple’s decision to go with just one action. Let’s work our way backwards:
    a) Tap-and-hold wouldn’t work because the AirPods don’t actually detect a tap the way the iPhone’s screen does. Their accelerometer picks up on the tap-tap movement to trigger Siri. Surely tap-and-hold could be implemented, but I doubt it would be as accurate.
    b) You can’t trigger play/pause with a single tap, since you’d end up triggering play/pause whenever you even graze the AirPod (think of the times you adjust an AirPod in your ear).
    c) A triple-tap just seems complicated. Think of how many times you’d be telling someone, ‘Oh, you triggered Siri since you didn’t tap fast enough’. On a device with a single input method (one you can’t even see when it’s in operation), this can get frustrating if it keeps recurring.
    (At this point, you’re obviously thinking of the button on the wired EarPods. If I call a triple-tap complicated, surely the EarPods’ press-press-press and press-press-hold actions run circles around the AirPods? Yes, but Apple doesn’t need you to be sold on the EarPods the way it does on the AirPods. If you’re frustrated with the button on the EarPods, you pull out your iPhone to complete your task; if you’re frustrated with the AirPods’ tap-tap-slap-whack, you might end up giving up on Apple’s second wearable platform…and then resorting to using your iPhone anyway.)
    d) Why not just include a button on the AirPods? You might as well be asking for the next version of iOS to support Flash.

    A double-tap, alternatively, has a very high rate of detection (given that it’s the only action), and Siri can lead you to play/pause anyway—at the expense of some comfort to a user accustomed to a dedicated play/pause action. And—this is key, and the reason I have reservations Apple will add a dedicated play/pause action—such an action is archaic. In sci-fi movies, you don’t see humans pressing a play button to start music; they just bark at the always-listening AI. (Let me throw this out there: I think Apple will remove the tap-tap once the AirPods have enough battery to support an always-listening Siri.)

  • If you look back to when the AirPods were just a rumour, one of people’s complaints was that they’d have yet another thing to charge. Again, a valid concern, but what matters is how Apple mitigates that pain point, and I think the AirPods handle the mitigation very well. The genius of the AirPods’ design is that the intent of ‘putting them away’ implies charging, since you instinctively reach out to put them back in the case (where they start charging). People listening for long stretches—on a flight, let’s say—aren’t inclined to take their AirPods off frequently, but day-to-day usage won’t have you listening for more than 3–4 hours at a stretch; people often consume their audio content in breaks. When you’re done, you’re inclined to put the pods back in the case owing to their miniature size. Additionally, you can charge the AirPods’ case while listening. The AirPods’ design is inclined to reduce low-battery grievances.
  • Owning yet another accessory that you have to charge is an added burden whichever way you put it. But knowing that all of them charge via a single cable cushions the pain a bit. I carry my Mac, my iPhone, and an old pair of Bluetooth headphones to work every day; the headphones require me to carry a micro-USB cable along with the iPhone’s Lightning cable. If I used AirPods, I’d leave yet another wire behind.
  • You’ve heard by now that iOS 10 opens up various parts of the OS for third parties to take advantage of. With the AirPods (and the Watch before them), Apple is doing the same for the iPhone’s concept itself.
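As an aside on the tap-tap gesture: interval-based classification of accelerometer spikes is one plausible shape for it. This is purely my own toy sketch in Swift—an assumption about how such detection might look, not Apple’s actual algorithm:

```swift
// Toy heuristic (my assumption, not Apple's algorithm): classify accelerometer
// spikes as a double-tap when two spikes land within a short time window.
// With only one gesture to recognise, a stray single spike can simply be ignored.
func isDoubleTap(spikeTimes: [Double], window: Double = 0.3) -> Bool {
    guard spikeTimes.count >= 2 else { return false }
    let sorted = spikeTimes.sorted()
    // Any two consecutive spikes closer together than the window count as a double-tap.
    for i in 1..<sorted.count where sorted[i] - sorted[i - 1] <= window {
        return true
    }
    return false
}
```

With only one gesture to recognise, the threshold can be generous, which is presumably part of why the double-tap feels so reliable.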


Pricing

  • $159 is expensive territory for headphones—let’s just get that out of the way. But I think the perceived expensiveness is driven by the fact that the AirPods are being compared to the wired EarPods they ‘replace’, not to other Bluetooth headphones. Bose’s Bluetooth headphones are considerably more expensive than $159, and what the AirPods lack in sound quality, they make up for in ease of pairing and—crucially—a better Bluetooth listening experience. What good are your expensive, awesome-sounding Bose headphones if the connection to your phone is flaky and unreliable?
  • A lot of people think the AirPods are a money grab. I think if that were Apple’s goal, they would’ve tied them in with some form of exclusivity to the new(er) iPhones. The AirPods work with iPhones as old as 2012.
  • Lastly, I agree with a lot of what Gruber says in his recent episodes of The Talk Show. Specifically, that Apple is selling the AirPods at a bare-minimum profit margin. I think Apple would have loved to ship the AirPods in the box, but they can’t because the production cost is considerably high. You can’t preach the ‘wireless is the future’ message and include wired headphones with your flagship product; and you can bet Apple knows that.


Realm announces realtime syncing

The Realm team:

Today, we’re launching the Realm Mobile Platform, a new offering that integrates our fully open-source (see below!) client-side database for iOS and Android with new server-side technology providing realtime synchronization, conflict resolution, and reactive event handling. The new platform makes it easy for mobile developers to create apps with difficult-to-build features like realtime collaboration, messaging, offline-first experiences, and more.

Our Android developer and I run our company’s apps on top of Realm’s database1. This is great news for apps that need cross-platform (or simply cross-device) synchronisation.

CloudKit handles storage and sync across multiple iOS devices. From what I’ve understood from Realm’s article, their service only manages the syncing—you give up the storage in going cross-platform. And, as I mentioned in the footnote, Realm is far better at being a simple database than Core Data—at least for a beginner like me.

  1. Look, I’m sure Core Data has tremendous value as you start peeling the onion’s layers apart in need of deeper, more precise functionality, but Realm is far more manageable for a basic database. ↩︎

In Uncategorized by Mayur Dhaka

DisplayMate, on the iPhone 7: Best LCD display

Joe Rossignol, MacRumors:

DisplayMate Technologies has declared iPhone 7 has the "best LCD display" it has ever tested, calling it "truly impressive" and a "major upgrade" over the iPhone 6 based on a series of advanced viewing tests and measurements.

Remember the rumours about Apple shifting towards an OLED screen with the iPhone 7? I don’t doubt the idea of Apple choosing an OLED screen for future iPhones (yet another rumour points to Samsung making an OLED display for the 2017 iPhone) because OLED lets the phone’s display blend in and become ‘one’ with the rest of the phone, as it does on Apple Watch — ‘…you can’t determine a boundary between the physical object and the software’. (Probably no more of those ugly black lines around the iPhone’s display either.)
It’s fascinating that, in all likelihood, Apple’s OLED display will be equal to or better than the iPhone 7’s LCD display.

In Uncategorized by Mayur Dhaka

The iPhone release date in India

Each year, new Apple products are featured on the Indian page with a ‘Coming Soon’ label (or no label at all)1. This time it reads ‘Available 7 October’—for both the iPhone 7 and the Watch Series 2—making it the fastest release in India ever. I’d like to take it as a good sign and say it’s part of Apple’s continued push into India, but I don’t think that’s the only reason. I think it’s more that Apple is able to crank out iPhones faster than before—that they’re getting better at the manufacturing process. (Jet Black iPhones being the exception.)

You still can’t preorder iPhones from Apple online in India since Apple doesn’t sell directly here. Online sellers have offered pre-orders in the past, but the quickest way to get yourself a new iPhone is by booking it at one of Apple’s partner stores. I pre-booked my iPhone 6S last year and stood in line (barely twenty or thirty people, nothing compared to the lines in the West) at midnight. This year, I’ll be doing that for the Apple Watch Series 2.

Interestingly, October 7th is a week past the end of the third quarter. When Apple announces iPhone sales for Q3—if they announce them—those numbers won’t include Indian sales.
(Maybe that works in Apple’s favour, because Indian vendors offer heavy discounts on consumer electronics during Diwali—one of India’s biggest festivals—which falls at the end of October this year. That, I assume, is when non-enthusiasts will flock to consider a purchase.)

  1. I looked through archive.org and the iPhone 6S is the only prominent example I could find. The iPad Pro, for example, launched alongside the iPhone 6S and had a ‘Coming Soon’ label along with it. ↩︎

In Uncategorized by Mayur Dhaka

The potential of a dual-camera system in the iPhone 7 Plus

I’ve been seeing tweets that imply the dual-camera system on the iPhone 7 Plus would be great for 3D photography and could possibly feed into a VR system at some point in the future. Ever since, I’ve wanted to read more about how a dual-camera system would help usher in 3D, and I came across this blog post by Shutterstock’s CEO Jon Oringer that sheds some light on the matter:

A flat lens right in front of a sensor (like a typical camera phone lens) doesn’t optically produce [Depth Of Field]. Today’s camera phones don’t have the ability to measure distance, so they can’t digitally re-create the DOF drama that a conventional lens does on its own. This next photo is more like one taken with a camera phone: Most of the image is in focus and there is little depth or drama to the image.

Just as our two eyes work together to detect depth, two lenses do the same. By using the disparity of pixels between two lenses, the camera processor can figure out how far away parts of the image are. […]

The magic is how software takes information from the two lenses and processes it into an image. Between the extra data collected from this new hardware, and the advancement of machine vision technology, the new iPhone camera is going to be incredible. Depth of Field is one of the last features necessary to complete the full migration from handheld camera to camera phone. Soon both amateur and professional photographers will only need to carry their mobile devices.

Let me illustrate what I’ve understood with an example: say there are two poles in front of you, one a foot away and the other ten feet away. In a dual-camera system, one camera can approximate the (relative) distance of the near pole and the other can do the same for the far one. The first camera (the one with the shorter focus) focuses on the first pole, so that pole must be closer—implying the second is farther, which is verified by the fact that the second camera can easily focus on it.

Now, if you were to place poles between these two, each one foot apart, working out how far each pole is becomes a question of how in-focus or out-of-focus that pole looks from each camera. (Allow me to cook up some arbitrary numbers here.) A pole that is 90% out of focus on the first—near-focused—camera and 10% out of focus on the second is the second-furthest pole.
In a single-camera system, you could only measure that an object is 10% out of focus. Whether that means the object is (the equivalent distance of 10%) further away from you or nearer wouldn’t be as easy to determine1. This, in effect, gives you the ability to measure in the third dimension (length and breadth, and now depth).
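The pixel-disparity idea from Oringer’s post can also be sketched numerically with the classic stereo relation: depth = focal length × baseline ÷ disparity. Here’s a minimal sketch of that relation—every number below is made up for illustration, not an actual iPhone 7 Plus specification:

```python
# Toy sketch of depth-from-disparity, the principle the quoted post
# describes. Focal length, baseline, and disparities are made-up
# numbers, not real iPhone specs.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic stereo relation: depth = focal * baseline / disparity.
    A larger pixel shift (disparity) between the two lenses means
    the object is closer."""
    return focal_px * baseline_m / disparity_px

focal_px = 1000.0    # focal length expressed in pixels (assumed)
baseline_m = 0.01    # 1 cm between the two lenses (assumed)

# Two poles: the near one shifts a lot between the two images,
# the far one barely shifts at all.
near = depth_from_disparity(focal_px, baseline_m, 33.0)  # big disparity
far = depth_from_disparity(focal_px, baseline_m, 3.3)    # small disparity

print(f"near pole ≈ {near:.2f} m")
print(f"far pole ≈ {far:.2f} m")
```

The point of the sketch is just the inverse relationship: halve the disparity and the estimated depth doubles, which is why a single camera (zero baseline, zero disparity) can’t pull depth out of one frame this way.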

Again, this is an over-simplified illustration of what I’ve understood. I could be wrong—I’m not very familiar with the formal study of optics—and I’d be happy to correct any bit I got wrong.

(Back in 2011, HTC released a phone called HTC Evo 3D that had a dual-camera system and allowed you to capture a 3D image and view that image on its auto-stereoscopic display. I suppose it was a clunky experience; it never really took off.)

  1. Perhaps you could, by calculating the time light takes to bounce off an object—the further the distance, the longer the light takes to reach the camera. I assume this is how laser-assisted focus systems such as the one on LG’s G3 work.
    Also note that Google’s camera app used to figure out the relative distance of objects with a single camera too, but you needed to move the camera around your subject as you would for a panorama. ↩︎

In Uncategorized by Mayur Dhaka