Just bought my first iPhone (iPhone 6 plus) after years of Android.
Not clear if I am delighted yet.
Scheduled some time with a stylist …
Have been looking at options for cloud storage at the 1TB capacity limit.
Google offers 1TB for $9.99 a month.
MSFT offers 1TB for $69 a month <ooops> a year! With MS Office apps, and 1TB per user for 5 users for $100.
Dropbox offers 500GB for $99 <ooops> it's actually $499 a year.
For lots of reasons this past week I decided to look into books on the C programming language.
Very few books on C have been published in the last 10 years.
You’d think that given the amount of code that is still being written, the amount of code that must be supported, new books would exist.
Heck, you would expect web pages to exist.
Or I am just doing my web searches wrong.
The challenge with C, if you’re an author of a book, is that the language is pretty simple. The libraries are also pretty simple. The complexity of the language is that, unlike almost every other language out there, C does very little to obscure or hide the underlying hardware. To program in C is to program, for better or worse, directly on the underlying hardware.
Hardware doesn’t have garbage collection, memory hierarchies exist, CPUs have error handlers and registers that need to be carefully programmed. Hardware has errata that make your code break in weird ways.
There is a temptation to write a book about the C language that quickly turns into an apologia for the limitations of the language definition instead of an exploration of how and why it’s used and the value it brings.
Most texts and books that exist for other programming languages advocate a style of programming that tries to create a nerfed environment that hides the complexity of hardware. The theory of those authors and languages being that the physical reality of hardware gets in the way of creating magical software that only the Turing Tar Pit constrains. Heck, Apple just released such a language, called Swift, to get away from Objective-C because, in many ways, it didn't abstract the hardware enough.
There is a book crying out to be written about how to program to the hardware-software interface. A book that demystifies a lot of what I have learned through painful, bloody and miserable training.
If someone has a good book, just drop me a comment.