Musing #75: Analysing the Valencia Formula E debacle

Some may call it exciting and others farcical. For a fledgling series, Formula E certainly attracted the wrong kind of attention with yesterday's slipshod finish, which resulted in only 9 drivers being classified at the end. While it is easy to call out the FIA or Da Costa for the extra lap, the reality, as always, is more complex. This analysis thus aims to clarify the events as they unfolded.

The first big question I came across on the web was about the big change in the energy-reduction percentage during the various safety car (SC) periods. While there was only a 3% reduction during the 3-minute SC at the 20-minute mark, the final 5-minute SC resulted in a 12% reduction. This is quite easy to explain keeping in mind the starting usable capacity of 52 kWh. Essentially, the percentage displayed on screen is a relative value, whereas the absolute reduction happens in kWh and applies to both the usable energy and the available energy.

At 20:38 remaining, we can see that the available energy is 61%.

Moments later, it drops to 58% after the reduction.


The reduction itself is 3 kWh for that SC period and a total of 9 kWh for the race.


This latter figure is the most important, as it determines the total usable energy after the reduction. Back-of-the-envelope calculations for this scenario are as follows. Note that these calculations are based on the whole-number figures displayed in the TV graphics; the actual numbers at full precision would differ slightly.

Usable energy before reduction: 46 kWh (52 - 6)
Available energy before reduction: 61% of 46 kWh = ~28 kWh

Usable energy after reduction: 43 kWh (52 - 9)
Available energy after reduction: 58% of 43 kWh = ~25 kWh (28 - 3)

Now let us take a look at the final SC. The available energy is 18% prior to reduction.

It drops to 6% after reduction.

The energy reduction is 19 kWh in total and 5 kWh for that SC period.

The calculations now are as follows:

Usable energy before reduction: 38 kWh (52 - 14)
Available energy before reduction: 18% of 38 kWh = ~7 kWh

Usable energy after reduction: 33 kWh (52 - 19)
Available energy after reduction: 6% of 33 kWh = ~2 kWh (7 - 5)

As you can see above, the percentage value can be quite confusing for the viewer: both the numerator and the denominator change by the same absolute amount, so the drop in the displayed percentage is far more drastic at lower energy values than at higher ones.
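To make the arithmetic concrete, here is a minimal sketch based on the whole-number TV-graphics values quoted above (the actual FIA figures will differ slightly):

```python
TOTAL_USABLE = 52  # kWh, the starting usable capacity

def apply_reduction(total_cut_before, pct_before, total_cut_after):
    """Available energy (kWh) and displayed % around an SC reduction."""
    usable_before = TOTAL_USABLE - total_cut_before
    usable_after = TOTAL_USABLE - total_cut_after
    available_before = pct_before / 100 * usable_before
    # The same absolute cut (in kWh) applies to the available energy too.
    available_after = available_before - (total_cut_after - total_cut_before)
    return available_before, available_after, 100 * available_after / usable_after

print(apply_reduction(6, 61, 9))     # first SC:  (~28 kWh, ~25 kWh, ~58%)
print(apply_reduction(14, 18, 19))   # final SC:  (~6.8 kWh, ~1.8 kWh, ~6%)
```

A comparable cut in kWh thus shows up as a 3% drop early in the race but a 12% drop at the end.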

That explains the TV graphics, but then why were the teams caught so unawares towards the end? For that, let us go to the period just as the final SC came out.


At this point, Da Costa has 22% of 38 kWh of usable energy, i.e. 8.4 kWh. The fastest lap, as can be seen in the official notice, was about 1m 40s. This indicates that, at that point, there was enough time left for 5 laps.

However, when the SC came out at 5:38 remaining, there was still a part of the lap to go. Luckily, Da Costa floored the car at about the same spot after the SC period, so we can easily work out the time needed to reach the finish line from that part of the track.


As can be seen from the images above, it takes about 25 seconds, implying that Da Costa would have crossed the line with 5:13 remaining had there been no SC. At the fastest lap of the race, 3 laps would have taken around 5 minutes, leaving the clock still running. However, if Da Costa had completed 3 more laps at 1:44 to 1:45 each, time would have expired during the third lap and he would have needed to complete only 4 laps to finish the race, using about 8 kWh, which was perfectly feasible.
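For illustration, here is a rough sketch of that lap arithmetic, assuming the time-plus-one-lap race format (the lap in progress when the clock hits zero is completed, and one final lap follows); the figures are the approximate ones quoted above.

```python
import math

def laps_to_finish(remaining_s, lap_s):
    # Assumed time-plus-one-lap format: run full laps until the clock hits
    # zero, complete the lap in progress, then run one final lap.
    return math.floor(remaining_s / lap_s) + 2

remaining = 5 * 60 + 13                  # 5:13 left at the line with no SC
print(laps_to_finish(remaining, 104))    # 5 laps when lapping at 1:44 flat
print(laps_to_finish(remaining, 105))    # 4 laps when backing off to ~1:45
print(4 * 2.0)                           # ~8 kWh needed at the 2 kWh/lap target
```

That one-second swing between 1:44 and 1:45 is exactly the knife-edge the leaders were balancing on.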

The problem, then, is that the SC pace was not slow enough to scrub off a lap, and thus the cars still had to complete 4 laps to finish the race: around 2 laps under the SC and a little more than 2 laps at race pace. While the race-pace target was 2 kWh/lap, under the SC, assuming a lap time of 2m 30s, the reduction worked out to 2.5 kWh/lap. This implies that the cars lost around a kWh of energy behind the safety car purely due to the time elapsed, while at the same time using close to 1.5 kWh of energy following the SC.
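The same back-of-the-envelope figures can be laid out as follows (the 0.3 kWh/min consumption behind the SC is the estimate derived further below):

```python
RACE_TARGET = 2.0   # kWh per lap at race pace
SC_LAP_TIME = 2.5   # minutes, assuming a ~2m30s lap behind the safety car
FIA_RATE = 1.0      # kWh per minute, the fixed reduction applied by the FIA
SC_USAGE = 0.3      # kWh per minute consumed trundling behind the SC (estimate)

cut_per_sc_lap = FIA_RATE * SC_LAP_TIME          # 2.5 kWh taken per SC lap
loss_per_sc_lap = cut_per_sc_lap - RACE_TARGET   # 0.5 kWh worse than race pace
print(2 * loss_per_sc_lap)                       # ~1 kWh lost over ~2 SC laps
print(5 * SC_USAGE)                              # ~1.5 kWh used behind the 5-minute SC
```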

Thus, while in race conditions Da Costa was expected to have 4.4 kWh of energy at the point where he started the final run with 2 laps to go, in reality he had about 2 kWh. The only feasible option was to limit the race to one lap after the SC, which he was unable to do. At the same time, Mercedes seems to have had a 5-lap target in mind before the SC and was thus keeping more energy in hand than Da Costa, who was hoping to limit the race to 4 more laps when the final SC came out.

While it was a shambolic end to the race, with the FIA shifting the blame to Da Costa for not controlling the pace, the FIA is not without blame either. It had never made provision for such a scenario, and the fixed reduction of 1 kWh/min it applies is excessive when the usable energy is low.

One way of tackling such a scenario would have been to spread the energy allocation for a race lap over an SC lap duration (i.e. a ~2 kWh reduction over a 2m 30s SC lap). This results in a reduction rate of 0.8 kWh/minute. If this seems too low a reduction, the rule could be changed to apply this limit only during the final 10 minutes of the race.

The other option would have been to not deduct the energy consumed by the car behind the SC, which seems to be about 0.3 kWh/minute (calculable from the fact that the available energy dropped from 22% to 18% of the 38 kWh usable energy behind the 5-minute SC). However, this results in a reduction rate of 0.7 kWh/minute and is thus even more benevolent than the previous approach, which is in fact the more practical one as it takes into account the energy consumption of the race car and the SC on a specific track.
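A quick comparison of the three schemes over the final 5-minute SC makes the difference plain; all rates are the assumed figures from above.

```python
SC_MINUTES = 5

schemes = {
    "FIA fixed rate":        1.0,        # kWh/min, the current rule
    "race-lap allocation":   2.0 / 2.5,  # 2 kWh spread over a 2m30s SC lap
    "net of SC consumption": 1.0 - 0.3,  # exclude energy actually consumed
}

for name, rate in schemes.items():
    print(f"{name}: {rate:.1f} kWh/min -> {rate * SC_MINUTES:.1f} kWh cut")
# FIA fixed rate:        1.0 kWh/min -> 5.0 kWh cut
# race-lap allocation:   0.8 kWh/min -> 4.0 kWh cut
# net of SC consumption: 0.7 kWh/min -> 3.5 kWh cut
```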

Will this situation be addressed? That is anybody's guess, as Formula E certainly seems quite disorganised at present. However, the solution to the problem is available, as highlighted above, and all it needs is for the FIA to act on it. Most probably, though, the teams will adjust their software to account for the SC loss going forward, and we might see a slow but secure finish if such a scenario arises in the future.

Musing #74: Designing Fossil Hybrid HR Watch Faces

Timepieces move on with time, and the Fossil Hybrid HR happens to be the latest one occupying my wrist. I always have the urge to customise whatever I can get hold of, and McWatchFace was my first effort where a watch face is concerned.

Following that, I moved on to the Fitbit Versa 2, but I could never invest enough time to create a watch face for it. I had been yearning for a more traditional watch, but ever since I started using fitness trackers, I haven't been able to completely wean myself off them. To that end, I got hold of the Fossil Hybrid HR a fortnight ago.

Having never owned a Pebble in its heyday, I have found this watch to be a revelation in terms of tinkering, as it doesn't take much to design something new and share it with the world at large. With that, I present my first watch faces for the Hybrid HR. If you already own this watch, feel free to grab these watch faces from the Fossil store and let me know if you have any ideas worth implementing.

6 B&W

Tutorial #25: Tweaking the Asus TUF Gaming A15 - Part Deux

My previous post about the A15 was about plucking the low-hanging fruit of performance. However, there is always scope for optimising the settings further to gain the greatest benefit at the lowest cost. That is what I have been up to, on and off, since the last post, and having reached a satisfactory result, I have decided to share it for anyone trying to squeeze that little bit extra from this hardware.

The base concept is still the same: get more out of the GPU at the expense of the CPU within the permitted power and thermal budget. To that end, I went through the following process:

1. Reduced the CPU temperature limit to 85°C and the Normal/Short/Long TDP limits to 25/35/45 W respectively in Ryzen Controller to provide further headroom to the GPU.

2. Reset the MSI Afterburner settings to stock, which had the following curve for my RTX 2060.

3. Ran an actual game as I would play it (in this case, Dishonored 2 at 2K Ultra with HBAO and triple buffering) and noted the average and maximum GPU frequencies attained in the middle of the game.

4. A few observations first. Neither the CPU nor the GPU is thermally throttled in any way. Instead, the GPU hits the power limit, which in the case of the A15 is 90 W. Note that the included GPU is the RTX 2060 Notebook Refresh and thus a 110 W TGP part. This indicates that the laptop does have thermal capacity to spare, especially as I conducted these tests at an ambient temperature in the mid-30s degrees Celsius. Having the option to push the GPU power further would have been great, but with that being an impossibility with a locked BIOS, the next step was to figure out how to extract the most from the hand I had been dealt.

To that end, I noted the frequencies which yielded the sustained performance (1560 MHz @ 812 mV) and the peak performance (1755 MHz @ 918 mV) in-game.


This concluded the stock performance analysis. There are multiple guides out there with different suggestions on how to proceed with undervolting or overclocking, but I decided to use these figures to set a target I wanted to attain: push the stock sustained performance down to the lowest voltage (i.e. 1560 @ 700) and the stock peak performance down to the sustained voltage (i.e. 1755 @ 812). Doing so manually with a smooth curve was going to be quite a challenge, so I took a bit of a shortcut, as follows:

5. Executed the Nvidia OC Scanner within MSI Afterburner to produce an OC curve. The curves are not always the same, so I executed it a few times, also at slightly different CPU TDPs to come up with the curve that resulted in the highest boost frequency. In this case, it was as indicated below.


Looking at the frequencies at the voltages of interest, the curve sits at 1515 @ 700 and 1755 @ 812. Thus, it seems I had almost attained the target I set out for without doing much.

 
Taking a look at the HWiNFO figures again with the OC curve, it can be seen that the sustained frequency has jumped to 1725 MHz from 1560 MHz, which is a decent OC. The peak frequency is now 2040 MHz, an even bigger leap, but it comes at a much higher voltage (1006 mV). The effect of this, however, is that the GPU now hits all the performance limits apart from the thermal one.

Almost there, but "almost" is not good enough, so I had to push it a bit further.


6. At this point, I decided to move the curve to the left, in effect overclocking the OC curve even further to see how much more performance could be extracted from it. I started by shifting the curve left by 25 mV, but soon after, I started encountering artifacts in-game, indicating that I had pushed it a bit too far. As a result, I shifted the curve by 12.5 mV instead and found it to be perfectly stable.

The other change I made was to flatten the curve at the halfway mark of the full voltage range, which is at 975 mV. There are various reasons to do so, the primary one being that the GPU never really reaches the frequency associated with that voltage, and when it does, as stated above, it is only for a fraction of a second. It also saves the futile effort of manually adjusting that part of the curve. An argument could be made that the curve could be flattened even earlier to essentially attain an undervolt, but I wanted to allow the GPU to boost to its practical maximum as much as possible.

With the above, after smoothing out the double frequency jumps (15 MHz instead of 30 MHz for a single increment in voltage step), I was left with the curve indicated below.
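For those curious, the three edits amount to a simple transformation of the curve's (voltage, frequency) points. The sketch below only illustrates the logic of what I did by hand in Afterburner; the function and its points are hypothetical, and nothing here talks to the actual card.

```python
def tweak_curve(curve, shift_mv=12.5, flatten_at_mv=975, step_mhz=15):
    """curve: list of (voltage_mV, frequency_MHz) points sorted by voltage."""
    # 1. Shift the whole curve left: reach each frequency at a lower voltage.
    shifted = [(v - shift_mv, f) for v, f in curve]
    # 2. Flatten: cap everything above the cutoff at the frequency reached there.
    cap = max(f for v, f in shifted if v <= flatten_at_mv)
    flattened = [(v, min(f, cap)) for v, f in shifted]
    # 3. Smooth double jumps: allow at most one 15 MHz step per voltage point.
    smoothed = [flattened[0]]
    for v, f in flattened[1:]:
        smoothed.append((v, min(f, smoothed[-1][1] + step_mhz)))
    return smoothed
```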


It starts at 1560@700 (surprise!), reaches 1755@800 and peaks at 2010@975. So how does this curve now fare within the game?


Firstly, we are back to hitting only the power and utilisation limits. The sustained in-game frequency is now 1755 MHz, a further 30 MHz boost over the default OC Scanner curve. The peak frequency is now 1965 MHz as against 2040 MHz earlier, but as I mentioned previously, that peak is transient, and if you look at the average GPU power, it has come down to 69.4 W compared to 70.3 W for the OC Scanner curve and 71.3 W for the default curve. Amusingly, the maximum power consumption was over 97 W with the stock curve, and I also observed it breaching the 100 W barrier in an intermediate test. Maybe it is a quirk of HWiNFO, or maybe the card is indeed capable of going over its locked TDP of 90 W in some cases, though without much benefit.

7. With the GPU OC sorted, I next wanted to see if I could push the CPU a bit more in coordination with this curve. You will have to take my word for it, but I tried increments and decrements of all the TDP values while keeping the temperature limit at 85 degrees Celsius, and I finally found the best performance at a Normal/Long/Short TDP of 25/40/50 W respectively.


The proof, of course, is in the pudding. Thus, I present the comparison between the stock performance and the CPU/GPU tweak. The HWiNFO figures span from the launch of the Time Spy test to the calculation of the score.

Stock:

Post tweak:

A good jump, and also a slightly higher score than the tweak in my last article (6703). What it doesn't show is that the power consumption is lower than last time.

8. One last thing! I haven't mentioned the GPU memory overclock at any point because I kept it for last. After trying different increments, I settled on a boost of 500 MHz, as it was stable and didn't lead to any noticeable increase in power consumption or thermals. With that, here is the final result.


To put things in perspective then, this is how the tweak stacks up against the stock setting.

Review #65: Asus TUF Gaming A15 Laptop (Ryzen 7 4800H | RTX 2060) ★★★★✭

 Team Red + Team Green - A killer combination!


Introduction:

Ever since Y2K, when AMD briefly stole the limelight by breaking the 1 GHz barrier and releasing AMD64, AMD as a company had failed to impress on me the need to purchase its products. I had opted for Intel just prior to the Athlon breakthrough, and every upgrade cycle of 4-5 years led me to opt for Intel again. Hence, I was simply thrilled to switch to Team Red after nearly two decades of being stuck with Team Blue. My GPU has always been Team Green, but with the integrated Vega 7, there is a dash of Red there as well.

The Choice:

During the 2020 holiday sale, it was between this and the Acer Predator Helios 300 for the princely sum of "not quite" one lakh INR. I could see the reviews racking up for the Core i7 variant on Flipkart, and I had even purchased it, but I cancelled as soon as I came across this Renoir masterpiece. It helped that Amazon also offered a much higher exchange price than Flipkart for an old laptop that was lying around.

To put it straight, the Helios 300 has only one thing going for it compared to this one, and that is the screen. On the flip side, the A15 comes with a monster CPU, DDR4-3200 memory, a 2000+ MB/s 1 TB SSD from Western Digital, a large 90 Wh battery, lighter weight, a higher-travel keyboard and about as good a cooling solution as the Helios. It also looks more professional than the Helios, so you can use it in formal environments without having people snickering at you. So overall, it is a win for the A15 over the Helios 300.

Display:

To address the elephant in the room, Asus skimped on the screen, using a Panda panel that covers only about 65% of the sRGB colour gamut and has a >20 ms response time, along with quite some screen flex. It pales (no pun intended) in comparison to the 90% sRGB, 3 ms panel on the Helios, but that is about it. I still managed to get popping colours out of it by increasing the saturation in Radeon Software and calibrating the display from within Windows. Sure, that throws accuracy out of the window in favour of something eye-pleasing, but I am not looking to do any colour work on it, and in any case, I intend to connect it to my 120 Hz 4K TV at home for gaming. I am unsure about it, but with the HDMI or DisplayPort output being driven by the Vega iGPU, it should also support FreeSync directly, unlike laptops with output routed through the Intel GPU.

Hardware:


The primary reason for getting this laptop is the Ryzen 4000 series. The 4800H puts the Core i7 to shame. I ran Cinebench after updating the system and without any tweaks: it registered nearly 500 on the single-core and 4386 on the multi-core test, scores that even the Core i9-9980HK can't touch in most laptops, it being a blast furnace rather than a processor. The 4800H did not even touch 80 degrees in the Cinebench multi-core test. It did go past 90 in Fire Strike, but it never thermally throttled, whereas the undervolted 9980HK in my earlier laptop hit 100 degrees within seconds and throttled like it was being asphyxiated.

The RTX 2060 here is also the 2020 "refresh" variant with 1.25 V GDDR6 and a higher TGP. It passed 15,000 in Fire Strike on the first run, and with the CPU running much cooler, it opens up the possibility of overclocking the GPU further than you could on an Intel machine.

Among other points, the machine ran without much noise during the benchmarks, but I expect it to reach whirring heights with demanding games, something to be expected of most gaming laptops. I haven't checked the battery life and probably never will over the life of the laptop, as I always use it plugged in, but the 90 Wh battery paired with the 4800H should provide longer battery life than any Intel gaming laptop. The lonely USB 2.0 port on the right-hand side is a bit of a let-down, but I have my fingerprint reader permanently plugged into it so that I can use Windows Hello. Not having TB3 is also disappointing, but I can't see myself needing it over the lifespan of the laptop, as DLSS will most probably help with higher resolutions in the near future.

Tweaking:

As expected, the UEFI on the laptop is barebones. AMD also doesn't support Ryzen Master on laptops, leaving it to OEMs to decide on the thermal envelope. That leaves Ryzen Controller as the tool of choice, as it has experimental 4000-series support. However, with it currently being limited to STAPM settings, it is more useful for extracting extra performance than for lowering temperatures, and thus is not the need of the hour.

However, as I mentioned previously, there is light at the end of the tunnel in terms of extracting more performance from the GPU. As the following 3DMark screenshots indicate, the GPU is able to provide 6-7% more performance using Auto-Overclock at the loss of less than 1% of CPU performance. The GPU temperatures too are similar, though the CPU temperature does go up by 4-5 degrees at idle and 2-3 degrees at full load, but it still does not throttle.

Warranty:

The unit received from Amazon was manufactured just 2 weeks prior, going by the warranty registration date. This date can be changed to the invoice date by providing Asus with the invoice and a photo of the laptop's serial number. An additional year of warranty, after using the 10% off code provided with the laptop, costs about $35, which is quite respectable.

Conclusion:

To sum it up, at the sale price, you can only go wrong with a gaming laptop if you choose Intel. Asus got most things right apart from the screen which is gut-wrenching but not a deal breaker, especially if you use a monitor or TV. In this case, it is what’s inside that counts and this thing is as TUF as it gets.

P.S.: It comes with a huge 16 A plug that would probably go well with a microwave in the kitchen. Thankfully, the power adapter has the same standard connector as a desktop PSU, so I was able to connect a 16 A cable with a regular-sized plug. You could probably get away with a lower-amperage cable, but it is best to get a 16 A one if you can.

Musing #73: Raspberry Pi 4B - SD Card vs SSD



Earlier this year, I finally bid adieu to my Raspberry Pi 2 in favour of the Raspberry Pi 4B. The RPi 4B certainly opens up new horizons with its additional power, though I don't suppose it is the desktop replacement some marketing material would have you believe. I couldn't care less about that aspect, as it is meant to be more of a hobbyist product, though I did switch over to the MATE desktop environment from the kiddish-looking LXDE environment that Raspberry Pi OS comes with by default.

What I was looking for most was the general increase in performance, and with the BCM2836 SoC in the Pi 2 v1.1 not supporting USB boot, this was the first time I could boot off USB. I immediately jumped onto the 64-bit beta back in May, along with the EEPROM update that allowed booting off a USB drive.

I already had a SanDisk X110 M.2 SSD with me from an older tablet, along with an M.2-to-USB enclosure. Unfortunately, I quickly realised that the enclosure was not up to the task, as even loading the boot files failed repeatedly. It seems the controller on a cheap enclosure isn't really that good (who would have thought?), so it meant getting another one. I went with an ORICO one this time, not expecting it to be great, but at least better than the one I had, since it cost 3 times as much. Sure enough, it did the job.

So how much faster is the SSD compared to the SD card? Unsurprisingly, the difference is quite significant. The low power of the processor on the Pi gives a better idea of the difference made by the SSD alone, though with it being limited to a shared 5 Gbps interface, the full extent wouldn't be evident if you plug in another USB 3 device or wastefully use an NVMe SSD instead of an NGFF one.
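If you want to quantify it yourself, a crude sequential throughput test is easy to knock together; fio or dd are the more rigorous tools, and the mount path below is hypothetical.

```python
import os
import time

def seq_throughput(path, size_mb=256, block_kb=1024):
    """Rough sequential write/read speeds in MB/s for the drive behind `path`."""
    block = os.urandom(block_kb * 1024)
    start = time.monotonic()
    with open(path, "wb") as f:
        for _ in range(size_mb * 1024 // block_kb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # make sure the data actually hits the disk
    write_speed = size_mb / (time.monotonic() - start)

    # Note: reads may be served from the page cache; drop caches first
    # (sync; echo 3 > /proc/sys/vm/drop_caches) for an honest read figure.
    start = time.monotonic()
    with open(path, "rb") as f:
        while f.read(block_kb * 1024):
            pass
    read_speed = size_mb / (time.monotonic() - start)

    os.remove(path)
    return write_speed, read_speed

print(seq_throughput("/mnt/ssd/bench.tmp"))  # hypothetical SSD mount point
```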

Long story short, unless you need your Pi to occupy as little space as possible, it makes sense to boot off an SSD instead of an SD card. Also, if you have a good case, like the Argon One I picked up recently, it is possible to overclock the Pi to the maximum 2.147 GHz without voiding the warranty and still keep the temperatures lower than at stock frequency without cooling. All this does tempt me to give the Pi a go as a daily driver, but for now there are many other creative uses for the device that take precedence. Until next time, Godspeed!

Review #64: Samsung Galaxy Buds+ (4-month review) ★★★★✬



I left my previous post on a bit of a cliffhanger, but things have changed a lot since then. One would imagine that being stuck at home would offer better opportunities to engage in one's passions, but quite the opposite has turned out to be true. 2020 hasn't been an easy ride and no one could have seen what was coming, but that's the story of life, our life.

To pick up from where I left off nearly 4 months ago, I did pick up an alternative in the week following my previous post, and the choice is reflected in the title of this post. You might recollect that it was a balance between price and quality for me, and in that sense, the Buds+ hit it out of the park, provided you pick them up at the right price.

While even the renewed Jabra Elite 75t was priced at 10.3k INR ($138), I picked up the brand-new Buds+ at 8.5k INR ($114), and they are now priced even lower at 8.2k INR ($110). Granted, you will have to find the means to purchase them from Samsung's corporate portal rather than the consumer one, but at that price, you can easily see why they make a really compelling option. It is rare to have electronic items priced lower in India than in the US, so it is good of Samsung to offer them at such a competitive price, albeit hidden from most consumers.


I picked up the blue variant simply on account of it not existing for the previous version. I am not particularly picky about colours, but this shade turns out to be quite "cool". New colour variants are being released all the time, so you may have a personal preference, but at release, this was the only option if you didn't want to go with the non-colour black and white ones.

The packaging is pretty standard by now for most true wireless earbuds, but Samsung gets most of it right, starting with USB Type-C support for the case. Speaking of the case, it is much smaller than what you might get from competitors, and light at about 39 g, which was possible simply because Samsung managed to pack incredible battery life into the earbuds themselves, as against having multiple recharges provided by the case.


There are 3 sizes of tips provided, along with hooks and a Type-C cable. I eventually had to go with the largest tips to get a good fit and passive isolation, but they get the job done. The buds may support some of the Comply foam tips if you prefer those, but I couldn't use my MA650 Wireless tips, even though they were a much better fit, for the simple reason that the buds wouldn't fit in the case with those attached. A real bummer! Apart from the tips, the wingtips offer the extension required to lock the buds in place. I can see this being a godsend for some people, but it never worked for me. I get a snug fit in my right ear and a loose one in my left, which irritates me to no end, but I guess you can't change your ears to suit devices, and I wouldn't want to know about it even if it were possible. Thankfully, the buds themselves are quite light at a little over 6 g, as otherwise walking or running with them would have been a hard time.


Going back to the point of battery life, the Buds+ boast 11 hours of battery life from the buds themselves. That is a tall claim, but one I am inclined to believe based on anecdotal evidence, as it is nigh impossible to have an 11-hour listening session. However, I went through a complete workday with the buds in ear or lying about and finished the day with 55% battery left, with office calls and some music thrown in. The battery capacity figures are indicated above, and basic maths would indicate that the case offers a bit over a single full recharge, hence Samsung's claim of 22 hours of total listening time.


The case itself has a multi-coloured charging indicator, on the inside for the buds and on the outside for the case, which does a good job of showing whether the buds are being charged as well as the battery level of the case itself, going from green to orange to red. Wireless charging support would also come in handy in emergencies if your phone happens to support reverse charging, which sadly isn't the case for my 7T.


One thing that has been consistent is Samsung's rate of software updates, which is good to see. The screenshot above on the left shows the first update I downloaded straight out of the box, and the second shows the latest update, which happens to be the fourth in the 3 months since purchase, so a decent clip. A lot of the initial updates focused on ambient noise, while the latest ones have moved more towards stability. Even so, features have been added over time, the latest being seamless device connection, or at least the option to toggle it off, which comes in handy when devices are fighting to take control of your buds, as is the case with Windows.


The "Labs" section is another place to access experimental features that Samsung feels are not ready for prime time. I found the edge double-tap to be the most useful, not for tapping on the edge for volume control but rather the base of my ear, and it works surprisingly well. The detection is done using the accelerometer, so it doesn't matter how you activate it. This gives rise to the possibility that some people might trigger it with sudden ear movements, but it has been pretty flawless and convenient for me.



Continuing with the app interface, the image above shows the main page of the app, which gives an overview of all the available settings. The most visible change has been to the battery indicator, which has been switched from displaying the individual levels to a combined one. Obviously, the two buds may not have the same battery level, depending on connectivity and usage, so the individual display was better in that sense, but I am pretty sure a lot of people complained about the asymmetrical battery life as a device issue, and hence we are now probably looking at the lower of the two battery levels, which limits information as far as single-bud usage is concerned. Unlike the 75t, which uses a master-slave (leader-follower?) combination, each of the Buds+ is capable of being used independently, which makes it odder still that Samsung moved to a combined battery indicator.

Apart from that, there is a simple 6-preset equaliser, where I would instead have preferred at least the 5-band equaliser that Jabra provides. This limits the tweaking ability, and I assume a lot of people will go for the Bass boost option, because this set is far from being as bass-heavy as the Jabra Elite 75t. The other options are unlikely to be used much, apart from probably the Touchpad one.


One might have expected more customisation from a section dedicated to the Touchpad, but only the touch-and-hold action is customisable, of which 'Ambient sound' is a must-have. The rest of the controls are pretty intuitive and don't take much time to get used to. The Lock touchpad option comes in handy when dozing off, and I admit to making use of it a couple of times to good effect. Overall, having a capacitive touchpad is better than having to press physical buttons, so +1 (see what I did there?) to Samsung for that.

I believe I have covered everything apart from the audio until now, and a lot of people would chastise me for beating around the bush. However, sometimes it is best to keep the best for last. To prevent any confusion, I am not talking about the audio being the best in its category, but rather the best aspect of the device itself. It really holds up well for what it is. By that, I would like to clarify that it isn't at the same level as the Jabra Elite 75t, but it is close. It can't punch bass to the same extent as the 75t and it has a smaller soundstage, but otherwise the clarity is quite good. I am putting this in the perspective of my use case, which is using these on the move; in such instances, higher audio quality doesn't matter much, as I would put down wireless buds altogether if I wanted to truly enjoy the audio. Also, Samsung has significantly improved the microphone quality over the previous iteration by including 3 sets of microphones, and they also do a good job with the 'Ambient sound' feature, which I believe is a must-have for any TWS earbuds. On the flip side, the microphones are too sensitive and pick up the ambience to a great extent, which is a shame, as the passive isolation from the earbuds is quite good and the wearer is oblivious to the noise others complain about, unless 'Ambient sound' is enabled during calls, and even then you cannot do anything about it other than apologise to the listener.

To address the elephant in the room, the Buds+ don't have any kind of aptX support. You will have to rely on AAC for most devices, and that isn't a great option on Android. While the audio quality is still decent, the latency is atrocious in apps that are not tuned to synchronise the video to match. Also, Windows does not support AAC, which means falling back on SBC, making things even worse. Samsung's Scalable codec might be a good alternative to aptX, but with it being limited to Samsung devices, it isn't going to be a smooth ride for those who wish to use the Buds+ for everything. However, my use cases mainly involve music and watching video in apps that are designed to synchronise the video with the audio, so it hasn't been much of an issue. Also, while Spotify might sound poor over AAC and SBC, my music collection is mainly in FLAC, and an AAC stream of it gets the job done on the move.

To round it off, getting 90-95% of the performance of the 75t at just above 50% of the cost is too good to pass up. You do lose the water and dust protection, with the Buds+ only being rated IPX2, and the audio quality is again a notch below the 75t, but that isn't going to be an issue on the move. The battery life is phenomenal, and the look, feel and fit are much better than the 75t. It boils down to your use case: if it is about having a great set of wireless buds on the move, then this fits the bill perfectly. If you are someone transfixed by audio quality and active noise cancellation, this one isn't going to float your boat. But for the value-conscious, there simply isn't a better option from a reputed company that cares to update its device beyond the initial purchase.

Musing #72: R.I.P. A50

Over the past few months, I had made multiple posts on the Samsung Galaxy A50, be it a short review, an initial analysis of the super slow-mo, or a guide on making the most of the mid-range hardware. Unfortunately, all average (good-ish?) things come to an end, and in this case it ended with my A50 being lost. The driver of the vehicle in which the phone was left behind gave me some hope by picking up my call, but what followed left me with a little less faith in humanity.

However, life goes on, and move on I have. At the same time, I have no emotional attachment to any material possession, so this post is not a eulogy for the A50 but rather a short post on what can be done to make the most of the situation when a phone is lost.

Samsung puts a fair amount of bloatware on its phones, but one piece of software that is genuinely useful is "Find My Mobile". This feature is markedly better than what Google offers, and there are several options for dealing with the lost device besides simply tracking it, like erasing the device, ringing it, retrieving recent calls/messages and extending the battery life. Unfortunately, my trust in the driver led me to not open the tracker immediately, which in turn ensured that the device was never again switched on with my account active.


With the horse having bolted from the barn, some solace can be found in rendering the device useless, well, as a phone at least. The Department of Telecommunications (DoT) in India launched the Central Equipment Identity Register (CEIR) earlier this month, which is supposed to make blocking a lost phone as easy as snapping your fingers.

Unfortunately, as with many government initiatives, things sound much better on paper and on websites than in reality. I went through the process of lodging a police complaint at the place where the phone was lost, with the expectation of making the most of this lifeline afforded by the DoT in terms of being able to take some action on the lost device. As it turns out, while the website correctly verifies the IMEI using the dedicated tool, the form itself fails to submit, with an error stating the absence of data for the IMEI. A really shoddy implementation by C-DOT, backed by an equally appalling lack of response on social media. I would still give them the benefit of the doubt considering it has been launched as a pilot project, but I hope they will be inclined to fix the website eventually.

Even misfortune is worth the experience, and I would say this is a lesson well learnt. A bit more practicality over trust in humanity might have saved the day. Hopefully, this post will equip you to handle such a scenario far better than I did. See you on my next mobile adventure.