Bought myself the HTC Flyer a few weeks back. Some co-workers were talking about tablets and this, along with the Samsung Galaxy Note, had been on my short list because of its smaller screen and pen input. My impromptu shopping revealed that Best Buy had just put them on sale for $300 (cf. MSRP $500!). Although the rest of the world also discovered the sale and promptly depleted the inventory of every Best Buy in Georgia, the next day they were selling in the Amazon marketplace for $350. I accepted the precedent set in the case of Snooze v. Lose and paid the $50 lazy tax to re-enter the world of tablet users (I hear these new iPads are popular with the kids). First impressions:
I used it for pen note-taking in the Ruby training class I attended for a week. Handwriting recognition is noticeably absent, but the screen size and responsiveness were good. Perhaps because I'm a lefty, I didn't notice any of the lag issues I'd read about. I haven't yet done any drawing beyond simple diagrams, though, so maybe the lag will be more intrusive there.
Thumb-typing is comfortable in portrait mode. I have larger hands, so the keyboard size works well for me. Lisa looked like she had to stretch a bit when working on it, so entering a lot of text won't be comfortable for everyone. I'd read about split-keyboard apps that mitigate the problem.
General browsing and app use is done by holding it in my right hand and manipulating the screen with my left. This is one of the great advantages of the smaller form factor over the unwieldier 10-inch tablets. I notice the weight after maybe 30 minutes of browsing, but it's minor.
Much of my use at home has been streaming anime from Netflix, or internet radio while I'm working or reading. The case I got folds back for comfortable landscape viewing. Image quality and refresh are good.
Ruby's been my go-to language for any medium-trivial scripts that need writing. At work, I'd used it for log file manipulation, for converting CSV files to SQL INSERT scripts, and for minor miscellaneous one-offs. Its benefits are its portability and terse power. Being a Java developer, I'd honestly prefer the almost-identical Groovy language, but it requires the JVM, and who wants to always rely on that behemoth (useful as it is)? Passing around a few .rb files plus an exe and a dll is just so simple.
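For a sense of what I mean, here's a minimal sketch of such a one-off, assuming a hypothetical data.csv with a header row and an equally hypothetical people table (both names invented):

```ruby
require 'csv'

# One-off CSV-to-INSERT conversion. 'data.csv' and the 'people' table
# are invented names for illustration.
table = 'people'
CSV.foreach('data.csv', headers: true) do |row|
  columns = row.headers.join(', ')
  # Quote each value, doubling embedded single quotes for SQL.
  values  = row.fields.map { |v| "'#{v.to_s.gsub("'", "''")}'" }.join(', ')
  puts "INSERT INTO #{table} (#{columns}) VALUES (#{values});"
end
```

A dozen lines, stdlib only, and it runs anywhere Ruby is installed, which is the portability point.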
This training (for the new job) covers Ruby proper plus Rails. Much of the framework maps easily onto what the JEE web container and Servlet API or the JPA/Hibernate API offer: a template library, request filters, URL mapping, and general data persistence and query support. Knowing one definitely helps in learning the other.
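For instance, Rails' default template library, ERB, ships in Ruby's standard library, so the rough analogue of a JSP page can be sketched standalone (the data here is invented for illustration):

```ruby
require 'erb'

items = %w[one two three]  # stand-in data for the sketch

# An ERB template plays the role a JSP plays in the JEE world:
# markup with embedded Ruby instead of embedded Java.
template = ERB.new(<<-HTML)
  <ul>
  <% items.each do |item| %>
    <li><%= item %></li>
  <% end %>
  </ul>
HTML

puts template.result(binding)  # renders using the local variables in scope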
Monkey patching (a.k.a. duck punching) is de rigueur, so classes are commonly rewritten at runtime as part of the standard framework; you just have to know that a class will end up looking a certain way. Also, interestingly, any number of unintuitive methods can be dynamically added via a callback named method_missing. This is called whenever a ... method is missing (doy). Frameworks use it to generate loosely defined methods at runtime (e.g. by parsing intent from a method name, such as "find_by_name_and_age_and_zip"). All of this tacit interface generation adds to the general confusion when learning; much of what a class can do is never explicit in the code.
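A toy sketch of both tricks in plain Ruby (the Finder class and its data are invented for illustration, not any framework's actual implementation):

```ruby
# 1. Monkey patching: reopen a core class and change its behavior.
class String
  def shout
    upcase + '!'
  end
end
p 'hello'.shout  # => "HELLO!"

# 2. method_missing: a toy dynamic finder in the style described above.
class Finder
  def initialize(records)
    @records = records  # an array of hashes
  end

  def method_missing(name, *args)
    if name.to_s.start_with?('find_by_')
      # Parse intent out of the method name: find_by_name_and_age -> [:name, :age]
      keys = name.to_s.sub('find_by_', '').split('_and_').map(&:to_sym)
      @records.find { |rec| keys.zip(args).all? { |k, v| rec[k] == v } }
    else
      super
    end
  end

  def respond_to_missing?(name, include_private = false)
    name.to_s.start_with?('find_by_') || super
  end
end

people = Finder.new([{ name: 'Ada', age: 36 }, { name: 'Alan', age: 41 }])
p people.find_by_name_and_age('Ada', 36)  # => {:name=>"Ada", :age=>36}
```

Note that find_by_name_and_age appears nowhere in the class definition, which is exactly the "non-explicit" problem when you're reading unfamiliar code.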
Readable, English-like code is a goal. Ruby's functional roots make this goal easier to achieve than in, say, C++ or any C-derived language (although the Boost library's Spirit parser does an excellent job of using C++ operator overloading to replicate BNF). Because of the emphasis on readable code, Ruby library developers talk about creating DSLs rather than simply APIs. This seems like an odd affectation.
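As a sketch of what they mean by a DSL (all names here invented), a few lines of instance_eval are enough to make bare words read like English:

```ruby
# A minimal internal DSL: the block runs in the context of a Recipe,
# so 'add' and 'bake' need no explicit receiver.
class Recipe
  attr_reader :steps

  def initialize
    @steps = []
  end

  def add(ingredient)
    @steps << "add #{ingredient}"
  end

  def bake(minutes)
    @steps << "bake for #{minutes} minutes"
  end
end

def recipe(&block)
  r = Recipe.new
  r.instance_eval(&block)  # evaluate the block with the recipe as self
  r
end

bread = recipe do
  add 'flour'
  add 'water'
  bake 40
end
p bread.steps  # => ["add flour", "add water", "bake for 40 minutes"]
```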
The functional bits are familiar enough if you know C++'s function objects or Java's anonymous inner classes. Doing these tasks in Ruby is, however, much cleaner. (Notably, C++11's lambda functions offer promise, as do Java 8's closures.)
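A quick sketch of the comparison, with invented data:

```ruby
# Blocks do the work of C++ function objects or Java anonymous inner
# classes, inline and without the ceremony.
names = %w[Charlie alice Bob]

p names.sort_by { |n| n.downcase }                 # custom ordering, no Comparator class
p names.select { |n| n.length > 3 }.map(&:upcase)  # predicate + transform, one line
```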
/Film has a review of the preview of The Dark Knight Rises. In it, the reviewer emphasizes the importance of seeing it in IMAX: "I can't even imagine watching the film in digital or 35mm, missing out on much of the epic scope."
I have mixed feelings about such technical requirements.
There are several similar concerns about authenticity and experience when watching a film.

Some directors and critics insist that a film be watched in the theater. I follow that rule only when there's awe to be gained (Tree of Life, LotR), but that's just a preference.

The (now-dead?) practice of colorization started a whole religious war. It feels gimmicky, and even when done well there are important aesthetic arguments against it. Framing and emphasis in black and white are achieved with tonal and textural contrast. At its simplest: a lighter figure standing against a dark background draws attention to the figure. When color is added, color intensity and palette relationships--as opposed to contrast--begin to define what the viewer's eye is drawn to. The director's choices are being overruled by the colorization.

A newer technical requirement is 3D. Just as black and white encodes certain artistic choices that color would alter, 3D encodes choices that 2D would alter. A spear thrust at the camera is an unimpressive circle in 2D perspective; it has a notably different effect in 3D.

Similarly, though more born of necessity, subtitles and dubbing can alter a film. In my opinion, subtitles are the lesser alteration of the two, since you retain the original actors' vocal tone and prosody.

Beyond these outside influences, there are alterations made by the creator. A director or studio may release different edits of a film. Blade Runner, notoriously, has seven different commercial versions available. This is common in any art form: multiple versions exist of many symphonies, and painters have said that a work is never truly finished. Which one should be considered canonical? And then there's the true religious war that George Lucas started with his digitally inserted aliens and gun-shy Han Solo.
So, although many choices made by the artist are part of the work's expression, I wonder whether the higher resolution of IMAX isn't just a luxury of venue.
I've been going to interviews over the past month, and the singleton is still the most popular point of discussion. I would try to bring up other patterns (let's discuss the visitor!), but to no avail. It's popular beyond its usefulness, and so I've come to hate discussions of it. However, it did force me to dredge up memories of discussions on the difficulties of managing the lifetime of a singleton in C++. The links: