Tutorial #19: Optimally managing photos and videos on iOS


Apple’s anaemic storage options (specifically at the lowest tier) have been a running joke for most of the iPhone’s existence. I was at the receiving end of Apple’s largesse when the entry-level iPhone 7 moved to 32 GB, which was further bumped to 64 GB with the iPhone 8. Even with these storage options, one can easily fill the device with content other than captured photos and videos. Besides, it is not a sound idea to have all your files stored locally on the device, irrespective of its storage capacity.

Apple provides a few options to mitigate the storage issues resulting from ever-larger multimedia content. These are as follows:

1. The default option that most will take recourse to is the iCloud Photo Library. However, Apple provides a meagre 5 GB of storage for starters and, as is typical of the company, you are expected to pay up to make this option practical. It only makes sense to go with Apple’s cloud if you live on it through other devices in the Apple ecosystem, like the Mac or iPad. The more important thing to note here is that by default Apple syncs your local photo library with the iCloud one, so you can end up permanently deleting your photos from the device as well as the cloud if you are not paying attention.

2. With the release of iOS 11, Apple introduced the high-efficiency formats, HEVC for videos and HEIF for images, which significantly reduce file sizes on modern iDevices. The downside is that support for these formats is still not standard across platforms and devices. Most notably, HEIF is not yet natively supported by Windows. Even within Apple’s ecosystem, sharing the images or videos with older devices necessitates a conversion (see the sketch after this list), which takes up time as well as processing power on the mobile device.

3. Lastly, iOS also provides an ‘Optimize Storage’ option that keeps a lower-quality version of the image on the phone for immediate viewing while retrieving the full-quality image from iCloud on demand. This helps with the storage crunch but yet again costs additional time and data.
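Speaking of conversions, if HEIC files from an iPhone do land on a Windows machine, they can be converted in bulk with a small script. Here is a minimal sketch in Python, assuming the third-party Pillow and pillow-heif packages are installed; the folder name is just a placeholder:

```python
from pathlib import Path

from PIL import Image                         # pip install Pillow
from pillow_heif import register_heif_opener  # pip install pillow-heif

register_heif_opener()  # teaches Pillow to decode HEIF/HEIC containers

# convert every HEIC file in the folder to a plain JPEG alongside it
for heic in Path("camera_roll").glob("*.HEIC"):
    jpg = heic.with_suffix(".jpg")
    # flatten to RGB first so the JPEG encoder never trips on alpha
    Image.open(heic).convert("RGB").save(jpg, quality=90)
    print(f"converted {heic.name} -> {jpg.name}")
```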

Luckily for iOS users, there are several third-party options that allow one to back up and retrieve photos and videos without having to pay or worry about running out of storage. After trying quite a few of them, I have shortlisted two well-known ones that together offer an unbeatable one-two combination: Flickr and Google Photos.

Before starting out, I would recommend that you go to Camera > Formats and select the ‘Most Compatible’ option, which uses JPEG/H.264 instead of HEIF/HEVC. This ensures that the images are usable without any conversion and accessible on all platforms. It will, of course, take up additional space, but since we are offloading most of the content onto the cloud anyway, storage isn’t a constraint. On the other hand, data usage can be a constraint if you are limited to a cellular network, but the solution here ensures that even that eventuality is covered. As for the ‘Optimize Storage’ option, you can leave it enabled as iOS always provides the full-quality image to any other app that requests it.

Our primary solution to the image storage problem is Flickr. One can argue that Flickr has seen better days and the Yahoo hacks might have left a few people rattled. Many photographers might prefer 500px as well, but none of that takes anything away from Flickr as far as our use case is concerned. Assuming that Oath (Verizon) doesn’t bring about any drastic changes to the storage policy, Flickr offers the best value proposition for free users. The 1000 GB of storage space is unprecedented, and the photography focus of the site makes it much better for photo management than a paid, storage-only option like OneDrive.

While Flickr has moved some of its tools, like the desktop ‘Uploadr’, under the Pro banner, the iOS app is unaffected. It is capable of syncing with the iOS Photo Library and, more importantly, uploading the original image to the cloud. It does not, however, support the HEIF format, as is evident when you try to upload such images through the website. On iOS, though, images in the Photo Library are still uploaded after conversion to JPEG; hence my earlier recommendation to use the ‘Most Compatible’ option and avoid unnecessary conversions. Unfortunately, Flickr doesn’t allow photos and videos to be treated separately when uploading over a cellular connection, so I would recommend syncing only over WiFi unless you have an uncapped cellular plan.

The sidekick to our protagonist Flickr is Google Photos. On its own, Google Photos is an awesome product. However, ‘original quality’ images and videos count against the storage available on Google Drive for non-Pixel users, which in most cases is 15 GB. Luckily, Google offers an unlimited ‘High Quality’ option which, one should note, compresses the image significantly. Thanks to the clever use of algorithms and machine learning, though, the effects are not visible on most devices unless the image is blown up considerably.

As a secondary backup solution, Google Photos offers some distinct advantages:

1. It caches lower-quality variants of all the images so that the entire library is accessible even when you are offline.

2. It offers smaller files on account of the compression and the resolution limits of 16 MP/1080p, which is useful when accessing or sharing something on social media over a cellular connection.

3. It allows photos and videos to be synced separately over WiFi and cellular connections, so that images can be synced immediately while larger videos are uploaded later over WiFi.

4. Once images are backed up, they can be easily deleted from the device (and iCloud) using the ‘Free up space’ option. For this, however, you should ensure that the original images have first been uploaded to Flickr.

5. The machine-learning-powered search is really useful in unearthing hidden images and recreating past memories.

Thus, the combination of Flickr and Google Photos ensures that you have all your images and videos backed up regularly, with redundancy, and available on demand. While Flickr provides the long-term, original-quality storage, Google Photos complements it with smaller, optimized content for on-the-go consumption. This cuts iCloud out of the picture entirely and ensures that you have more storage available on your device for the things you need and use far more regularly.

Musing #36: The Next Big Thing


I just started reading 'The One Device' the other day and have made it past the first couple of chapters, wherein the book briefly touches on Apple's transition to innovation after its lost years. Of course, this is not the first time I have come across the story, as the Steve Jobs biography covers it in much greater detail. However, the underlying message to take away is that well-executed ideas can make a huge difference to the fortunes of a company, even though the innovation may be more evolutionary than revolutionary.

Although the situation is far from similar, reading about this phase of Apple's history makes me ponder the flux the Indian IT industry finds itself in now. If anything, the need for innovation in the industry has only become more urgent. However, what comes out in the public domain sounds more like Orwellian Newspeak. The mention of AI, Automation, Cloud, Digital and Agile in the broadest of terms seems to have little intention beyond placating the shareholders. After all, shareholders in India seem to be a particularly emotional bunch, going by the swings that take place after an obvious piece of news is shared by the media. This has necessitated the use of these terms along with others like Big Data and DevOps, which have been in circulation long enough not to be considered part of a novel strategy. Yet this forms the basis of optimism for a huge industry and its employees.

Ideas need execution to be successful. The basic tenet of the Indian IT industry has been cost arbitrage and providing services for cheap. Unfortunately, the same strategy seems to be permeating the “new” fields. Hence, when the industry speaks of AI, it isn't referring to top-of-the-line machine and deep learning. Instead, it alludes to the automation of basic operational tasks based on limited algorithmic branching. Even the innovation that does occur in this space is not happening in India but through talent hired abroad, with the usual instruction-based implementation being passed on to cheaper coders back home. Similarly, the digital revolution through products and platforms is based on imitating the functionality of well-established software at a fraction of the price. It is thus a case of simply picking the low-hanging fruit.

Establishing any roadmap is based on industry trends and a fair bit of optimism. One certainly must move along with emerging technologies, but the success of any buzzword isn't guaranteed. A case in point is Virtual Reality. Only a few years ago, it was seen as the next big thing. Cost has always been cited as a key factor in the uptake of VR; however, that isn't the case for something like Google Cardboard. It certainly offers a basic experience, but at the same time it illustrates the fallibility of VR. Beyond the initial novelty, it becomes very difficult to get people to come back again. One can only take so many rollercoaster rides, scenic walks and museum visits in isolation. Gaming and interactive storytelling might be expected to alleviate this, but VR has become part of a vicious circle: it has been unable to attain critical mass, which has in turn kept content creators from investing too much in it. The VR industry is responding by cutting hardware prices for high-end headsets like the Oculus Rift and HTC Vive, but it unfortunately seems destined to be niche. As has been the case in the past, mobiles will have to lead the way. However, it seems inevitable that AR experiences like those enabled by Apple's ARKit will be the mainstream option, for once again it is just a case of incremental innovation.

This brings me back to the Apple and iPhone story. All the pieces of the puzzle were long in existence, but none of them had been put together in the manner that made the iPhone seem like magic. The next big thing will likely not be a revolution but a simple evolution that seems like magic. Being ahead of one's time is as much a failure as being late to the party. What one needs is a bridge between the present and the past, such that people find the journey to the future much more exciting than the destination itself.

Musing #34: Shifting of personal music collection to Apple Music


Switching back to iOS 10.3.2 from the iOS 11 beta and having to go through my music library to download the tracks once again made me realize that my personal music collection isn't as well sorted as it should be. I had never committed to using Apple Music as my one-stop music solution, having dabbled with other streaming services in the past. However, I finally bit the bullet as I found it to be the most convenient option for accessing my entire collection on the iPhone.

I should add that switching to Apple Music for all your music needs isn't the most seamless thing one can experience in the Apple ecosystem. One can also argue about the audio quality at 256 Kbps AAC, but that is subject to personal preference and doesn't perturb me too much. So, what does jumping into Apple Music with both feet entail?

1. Tagging your offline collection

Irrespective of whether you use Apple Music, it is always a great idea to properly tag and organize all your personal music files. In the past, I have used MediaMonkey and MusicBee, chosen on the strength of their interfaces, and they just about got the job done.

However, I found MusicBrainz Picard to be the best solution for tagging en masse. The scraper managed to match nearly 95% of my collection. For the unmatched ones, you can manually search for similar files or look them up online in the browser. In my case, I couldn't get the web tagging to work seamlessly, but that didn't matter much as the built-in search worked just fine.

The best thing is that you can easily choose the fields to be updated, including the artwork. Manual editing of the most obscure tracks is also pretty straightforward, though it requires additional effort on one's part. The benefit of this exercise is a well-organized local collection that is easily accessible and recognizable across devices.
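Picard handles the heavy lifting through its GUI, but a quick script helps verify that nothing slipped through afterwards. Here is a minimal sketch using the third-party mutagen Python library (not part of Picard); the folder name and the album-artist fix are just illustrative assumptions:

```python
from pathlib import Path

from mutagen import File  # pip install mutagen

# walk a placeholder music folder and flag files still missing basic tags
for track in Path("Music").rglob("*.mp3"):
    audio = File(track, easy=True)  # easy=True maps ID3 frames to plain names
    if audio is None:
        continue
    if not audio.get("albumartist") and audio.get("artist"):
        # illustrative bulk fix: copy artist into the empty album-artist
        # field so that albums group cleanly across players
        audio["albumartist"] = audio["artist"]
        audio.save()
        print(f"fixed album artist on {track.name}")
    if not audio.get("title"):
        print(f"{track.name} still needs a title")
```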

2. To the iCloud the music shall go

iTunes allows you to manually add up to 100,000 songs to the iCloud Music Library and access them on any device signed in with your Apple ID. The first thing you should know is that the files are converted to 256 Kbps AAC irrespective of the source quality, something that may not be entirely desirable. Secondly, the upload process just doesn't work the way it should.
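Before committing to the upload, it can be worth knowing which tracks exceed that 256 Kbps ceiling and are therefore worth backing up in their original form. A rough sketch, again assuming the third-party mutagen library and a placeholder folder name:

```python
from pathlib import Path

from mutagen import File  # pip install mutagen

AAC_CEILING = 256_000  # the 256 Kbps figure, in bits per second

# flag local tracks encoded above the ceiling; these are the ones
# whose cloud copies would be a downgrade from the originals
for track in Path("Music").rglob("*"):
    if track.suffix.lower() not in (".mp3", ".m4a", ".flac"):
        continue
    audio = File(track)
    bitrate = getattr(audio.info, "bitrate", 0) if audio else 0
    if bitrate > AAC_CEILING:
        print(f"{track.name}: {bitrate // 1000} kbps")
```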

On Windows, I had multiple upload failures, which were not easily detectable in iTunes since it doesn't explicitly flag them. The failures become evident only on rummaging through the 'Recently Added' list and spotting the entries with the "exclamation cloud" icon. I managed to re-upload some of these after multiple attempts, even though there was no obvious reason for the failures to begin with. Moreover, the synchronization on the iPhone kicked in only when I toggled the 'iCloud Music Library' option in the Settings. Even then, a few files hadn't made it to the cloud, evident from their being greyed out on the iPhone.

This entire process was certainly an exercise in frustration, but unfortunately the worst was yet to come.

3. iTunes Match - When reality doesn't meet expectations

Apple Music matches each uploaded track on the basis of its fingerprint and replaces it with an equivalent track from its catalogue. The point to keep in mind, though, is that all these replacement files are loaded with DRM that locks them to your account and subscription. In my case, I have some old tracks of suspect quality, so I didn't mind this exercise as I had a backup of my original files. In principle, it is a good way to ensure consistent audio quality and would have been acceptable if it worked as it should. Except that it doesn't.

The most egregious aspect is that the replacement track, although the same in name, can end up being a vastly different variant. Many of my edits and remixes were replaced by completely unrelated versions, which is infuriating to say the least. This meant combing through the collection once again to find the right variant. How a 3-minute track can be replaced by a 7-minute extended version, when the original 3-minute variant is in fact present on Apple Music, is simply beyond me.

As you can tell from the above, shifting your entire music collection to Apple Music isn't for the faint-hearted. Apparently, Apple will also allow FLAC files to be played from iCloud Drive, which would be yet another disjointed process added to the mix. If you care for your music like I do, you might think it worthwhile to invest a substantial amount of time in this process; otherwise, I can't really recommend it.

While accessibility and affordability are the prime drivers for subscribing to such services, the flip side is that the more you use them for music discovery, the more you lock yourself in. You limit your exposure to music within the chosen ecosystem, and there is no easy way to migrate to another service. A time shall come when, for better or for worse, you will no longer have an offline music library and won't have to worry about uploading and conversion at all. But whether that makes your music experience any richer is highly debatable.

Sundry #5: Word Cloud

It is said that a picture is worth a thousand words. In the case of a word cloud, it can't get any more literal. It is always interesting to get a visual representation of words, for it paints an altogether different picture (pun intended). It is quite evident that the sheer size of my article on bitcoin dominates the word usage as well, so it will be interesting to see how things pan out as I post more articles. Barring unforeseen circumstances, the latest word cloud should be available at the following link: http://timdream.org/wordcloud/#feed:http://feeds.feedburner.com/smajumdar
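The site linked above renders the cloud in the browser from the feed. For the curious, roughly the same result can be produced locally; here is a minimal sketch using the third-party feedparser and wordcloud Python packages, with arbitrary rendering choices:

```python
import feedparser                # pip install feedparser
from wordcloud import WordCloud  # pip install wordcloud

# pull the blog's feed and mash the entry summaries together
# (summaries may contain HTML markup, which is good enough for a sketch)
feed = feedparser.parse("http://feeds.feedburner.com/smajumdar")
text = " ".join(entry.get("summary", "") for entry in feed.entries)

# render and save the cloud; size and colours are arbitrary choices
cloud = WordCloud(width=1200, height=600, background_color="white")
cloud.generate(text).to_file("wordcloud.png")
```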

Musing #2: Computing by 2062


Ever since Charles Babbage set the proverbial cog in motion, the juggernaut of computing has steamrolled over the human imagination. One may view a point in time as the launch pad of colossal advancement or the precipice of destruction. History is a great teacher, but that seldom holds true for computing, where a spark can ignite an explosion of technological advancement that leaves well-accepted beliefs behind.

Peering into the next half century of computing is therefore, in essence, a leap of faith. However, the lessons learnt over the past half century lend a view of the path towards the future, though not of the destination itself. One irrefutable observation is that computing has become immensely personal over the years, to the point where the difference between the human and the machine is only skin deep. The future thus demands that the touch of the skin should no longer be a barrier.

There is no denying the fact that computing has been modelled in part on the human anatomy, because humans visualise machines as such. Thus, it is apt that the future of computer processing should involve the amalgamation of an organic mind with an inorganic one. In the future, a synaptic transmission would not be limited to the human mind but would extend to a computer capable of augmenting it. Processing of functions beyond the physical realm would thus be passed on to a more capable mind, networked with numerous others, disseminating the most pertinent real-time information for the human mind to act upon. Information could be visualised without the need for a display device, to the extent that it is meant for one's eyes only. The silicon computing of today would seem quaint in comparison to the quantum and probabilistic computing of the future.

Networking begets a means to transmit information, and undeniably the future is wireless. Light has proven itself to be the fastest and most efficient means of transmission, as the fibre optic cables of today would testify. However, future transmissions would have to be done wirelessly using the power of photons. Networking would have to be instantaneous and lossless to accommodate the flawless transmission of heileybytes of data. Monstrous computations would literally require a new dimension in storage. The future of storage would be in 3-D, be it in the form of DNA or holographic storage.

The mention of wireless brings into focus another of today's limitations: the need to be constantly wired up to a power source. Necessity would demand that computing be omnipresent, and with necessity being the mother of invention, the battery packs of yore would be consigned to the history books. While the seeds have already been sown in the form of resonant magnetic induction, contactless energy transfer through electromagnetic waves would lead the way in 2062.

It is certainly hard to think of computing as isolated silos in the future, and the growing influence of cloud computing indicates that history would repeat itself, with centralised computing once again being the way forward. It isn't hard to think of computing as a utility, much like water and electricity today. The all-important question of privacy would easily be addressed with the mind hooked up to the grid all the time, for the signature of the mind would certainly be more distinctive than the DNA forensics of today.

Coming to the software aspect of computing, it would be untenable to have multiple platforms working discordantly to achieve the same task. To that end, the operating systems of the future wouldn't be any more than an interface for accessing the computing hardware. A glimpse of this is already visible in the browsers of today, which are able to execute tasks across platforms using JavaScript and HTML5. With the browser becoming the window to the world, it is quite predictable that the future wouldn't be too different.

A key aspect of computing is programming. While it remains a specialised task today, the continual simplification of computing demands that it shouldn't be the case in the future. Gone will be the need to follow syntax, with semantics leading the way. Thoughts alone will be capable of mapping out the flow towards attaining the desired result. Creation would be as accessible as consumption.

Thus, it would take a brave person to visualise computing as not too different 50 years hence, and a braver person still to not acknowledge that it would be more personal, simplified and commoditized. Computing would certainly usher in a brave new world by 2062.
A cringe-worthy read it may be, but nonetheless it is one I wrote back in 2012, looking at 50 years hence. Can't really say I am prescient, for it seems a rehash of existing ideas.