Thursday, September 09, 2010
Running YourKit UI remotely
I got into a somewhat unique situation a few weeks ago: I had a 16GB JVM heap dump that I had to analyze with a memory profiler. It was obviously hopeless to analyze it on my laptop, which only has 4GB of RAM. Fortunately, I was given access to a computer half the world away that had 64GB of RAM. So I moved my YourKit profiler instance onto that machine and tried to run its UI through X11 tunneled over SSH.
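For those who haven't done this before, here's a minimal sketch of that setup (the -X and -C flags are standard OpenSSH; the host name and the profiler launcher path are made up for illustration):

    ssh -X -C me@bigbox.example.com   # -X tunnels X11 back to your local display, -C compresses the traffic
    ~/yourkit/bin/yjp.sh              # assumed install path; launches the profiler UI on the remote machine

The heavy lifting then happens against the remote machine's 64GB of RAM, while the UI paints on your local screen.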
Monday, June 28, 2010
I work for Twitter now
Monday, February 22, 2010
Windows Live nightmare
Last week I purchased a copy of Gears of War - it cost me some 12 bucks in local currency, one of the benefits of only buying 2-3 year old games. Yup, I'm the guy from this comic. So it happened that this weekend I decided to give it a spin, and went to install it on a Sunday afternoon.
As installs go, it was uneventful, and I got to launching the game.
Monday, November 16, 2009
November moment
Oh, the sudden urgency in people's steps in the street as the raindrops start hitting the pavement. I don't have an umbrella, so I'm drawing close to the buildings on the way home for what little shelter their walls and balconies can provide. I thought these buildings familiar, but either because I'm passing so close to them, or because of the rain, they smell different. It's surprising, but pleasant, and it triggers some deeply buried childhood memories.
Tuesday, September 08, 2009
Review of the Commodore 64 emulator for the iPhone
UPDATE: Apple yanked the app because of the BASIC interpreter hole, and the developers have plugged it and resubmitted the app. Way to go, Apple. I mean, what harm could that BASIC interpreter do? It has no means of loading external code - no access to the local filesystem of the underlying OS, no network connectivity, nothing. Are they afraid I'll manually type in a program from a listing published in a magazine or something? I got tons more sense of childlike wonder from toying with the interpreter than I ever could from playing Jack Attack and Dragon's Den combined! Sheesh... On to the original review:
Against better judgment, I ended up taking my iPhone with me to the bathtub yesterday evening. I needed to check something really urgent, like the IMDB rating of "Watchmen" or similar (I'd just finished the book). You know, the kind of stuff that's worth risking water damage to your high-tech gadget for. I ended up browsing my RSS feeds instead, though, and spotted the news about a Commodore 64 emulator being approved by Apple and available on the App Store.
Now, you need to know that the first home computer I was ever exposed to was a Commodore 64, at around the age of eleven. I have an enormous amount of emotional attachment to it, having been an enthusiast hacker on it as a kid. So there I was, sitting in my bathtub thinking, "this would be nice to get", and then realizing, "wait a minute, I can get it; right here, right now."
Let's pause for a moment, folks, to think about just how big an enabler the Internet is in our lives. I mean, there I was, purchasing a Commodore 64 emulator from my bathtub, and a minute later I had launched it and started toying around with the UI.
That's where disappointment started to set in.
Oh, don't get me wrong. The developers of this application (and of the excellent open source Frodo C64 emulator they based it on) deserve every praise possible. I'd used Frodo several years earlier, and it is a painstakingly precise reproduction of a Commodore 64, down to hardware side effects. The iPhone application is tasteful, snappy, responsive, and the UI is okay.
I do actually have one gripe with the UI, though. See, the Commodore 64's horizontal resolution is 320 pixels, exactly the same as the iPhone's. Since the developers added a surrounding graphic resembling the venerable Commodore 1701 monitor, its mask obscures some of the pixels at the edges. This is actually very annoying, since lots of games display important information near the top and bottom edges. These machines back then emitted an image with two distinct areas named "border" and "paper". The border was the outermost part of the image and had a single attribute: color. The paper was a smaller rectangle within the border where the image was actually displayed. The purpose of the border was specifically to act as a disposable margin that could be clipped to size by your monitor (in many cases, a TV screen) so that you wouldn't lose paper pixels. The developers of the iPhone emulator thus improved aesthetics at the expense of functionality.
But that's not what I'm disappointed with.
Honestly, I could be disappointed by the initial selection of games; they're all ancient, simplistic titles even by Commodore 64 standards, old 1983 releases from when developers didn't yet know how to use the platform to its potential. There will be more games available later, so I'm not too worried about that. (I hope they will release Elite...)
That's not what I'm disappointed with.
What I'm disappointed with is the lack of the Commodore 64 essence, as I experienced it. That essence, ladies and gentlemen, is hacking. Even as a kid, I disliked it when people used the machine exclusively as a game console. As a bit of personal history, let me tell you that my family was not exactly wealthy when I was a kid, and even that is an understatement. After I became interested in computers (following a series of articles in a monthly magazine for school kids), I would write my little programs (initially BASIC, later 6510 machine code) in a notebook, then pester one of the three people in our village who had a machine to give me a bit of time to try them out. I saw the potential for creating my own stuff in these machines, and couldn't imagine why someone fortunate enough to actually afford one would use it only to play games. I disliked people who didn't see beyond gaming (I'm not saying it was justified, nor that I think the same today; I was an eleven-year-old kid then, remember?).
The Commodore 64 was an utterly open platform. Even from BASIC, you could POKE around with it (for those not in the know, "POKE" is a BASIC command for directly setting the contents of a memory location). Even before I had a Commodore 64 of my own (I had to wait all the way until 1989 to be able to afford one; the cool kids were playing with Amigas by then), I had a book that contained a description of BASIC, of the contents of both the BASIC and system ROMs (complete with all system calls and the memory locations they used for settings, buffers, and so on), of all the peripheral chips (VIC, SID, CIA1, CIA2, etc.), their mapped memory locations and how to program them, a complete reference for 6510 machine code, and finally, the complete hardware schematics of the machine. If you had this book, you could do anything with the machine. Anything. And believe me, I did a lot. I won't go into details now, as I'm well into get-off-my-lawn territory here.
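If you never had the pleasure, here's a taste of that openness; a classic two-liner poking the VIC-II video chip's color registers (53280 and 53281 are the real memory-mapped addresses, burned into a whole generation's memory):

    10 POKE 53280,0 : REM SET THE BORDER COLOR TO BLACK (VIC-II REGISTER $D020)
    20 POKE 53281,0 : REM SET THE BACKGROUND ("PAPER") COLOR TO BLACK ($D021)

Two memory writes, and you've reconfigured the video hardware directly; no permissions asked, no API in between.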
That's why I'm disappointed that I won't be able to do a lot with this one, except play old games on my phone. I don't imagine that under Apple's regime I would ever be able to fire up Turbo Assembler and write some rasters, if you know what I mean. And that's what makes me sad.
What's more, Apple only approved the emulator once the developers had disabled BASIC, so it's not even the familiar C64 screen that welcomes you on startup. It looks like this instead:
However, it turns out there's a workaround; the BASIC ain't disabled, it's just hidden. If you go to "Advanced" and turn on "Always show full keyboard", then launch a game, go to the "EXTRA" keys, and tap "RESET", you end up with the familiar screen:
(Mind you, the problem with the monitor mask clipping the edge pixels is particularly pronounced in this picture - you don't really see the first character in each line!) I had a bit more fun with it now that I had a command line, though. For starters, let's see if there's something on the "floppy disk":
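For the uninitiated: the standard incantation loads the directory from device 8 (the first disk drive) as if it were a BASIC program, and then lists it:

    LOAD"$",8
    LIST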
Sure there is! Looks like a floppy for "Dragon's Den" (one of the games included with the app). Interestingly, it did not matter which game I started before resetting the emulator: the "floppy" in device 8 was always that of "Dragon's Den". I didn't find other numbered devices containing the other games' "floppies" either. For now. However, here's the kicker. (Or an easter egg, if it's intentional, although I don't think it is.) If you load the program from the floppy and run it manually:
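(The "*" below simply picks the first program on the disk; the exact filename doesn't matter here:)

    LOAD"*",8
    RUN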
OMG, YOU GET AN INTRO!
That's correct, ladies and gentlemen - the developers seem to have had trouble locating a genuine copy of the game, so they bundled one that was distributed by a cracker group. Back in the day, it was customary for people who cracked games for, erm, "unofficial distribution" to prepend to them a small program called an "intro": a colorful, blinking screen with greetings to fellow members of the trade and good chiptunes. These usually also led to menus for enabling cheats, and sometimes to on-screen instructions for the game in case the player didn't have the original game manual at hand. Like, say, if you're playing on an iPhone, right? (Honestly, if that hadn't been there, I would never have figured out this particular game on my own. So the intro actually added value for me. Also, the fact that someone had this game in cracked form but not in uncracked form strongly suggests that if it weren't for crackers, we wouldn't have this game today. Just sayin'.)
Needless to say, the intro isn't there if you launch the game from the iPhone app menu.
Verdict? Nice easter egg with the intro, folks. Beautiful, polished iPhone application. Potential for more game titles to come. But if you were more than a game player on the Commodore 64, don't hold your breath. You'll likely never be able to upload any of your own old-time C64 creations onto the device to show them off. For that, you'll still need an emulator on your desktop OS.
I hope at least they'll release Elite, though. That'd be awesome.
Tuesday, March 17, 2009
Relativity of simultaneity
A problem with staying in a profession for long years is that there are fewer and fewer things that can truly excite or surprise you as time passes. That's why I was particularly delighted to experience a moment of sudden enlightenment while listening to Rich Hickey's talk about "Persistent Data Structures and Managed References" last Thursday at QCon in London. The title of the talk might not sound exciting, but I have to tell you, if this were the only talk I had attended, it alone would have been worth the trip. I don't say this often. The slides are here, although you're obviously missing out on Rich's incredibly good verbal presenting style if you just read them.
Rich's view (with which I have to agree; in the time I've known Rich, I have yet to hear the first thing from him I'd disagree with, whether it be concurrency, memory models, or ideas for what to do in San Francisco if you only have a single day to explore it) is that the idea of variables in traditional programming languages is broken, because they assume a single thread of control at a time. This used to be true for single-threaded programs, and even, to a degree, for multi-threaded programs running on single-CPU systems.
He argues that we need well-defined time models for state changes in our concurrent computational systems. Without a time model, we fail to capture the situation where two observers (threads, CPUs) can observe two events (changes of state) in different orders.
Then it suddenly hit me! I learned this stuff in school! Relative timelines, observers... Rich is talking about relativity of simultaneity!
(Which is one of the simpler consequences of Einstein's special theory of relativity.)
Wait a moment. Finding such a surprising analogy between the physical world surrounding us and our digital computational systems seems unlikely at first, but the more you think about it, the more sense it makes. For event timelines to appear differently to different observers, we only need two prerequisites: truly parallel processing, and a finite speed of change propagation. Both hold for the real world (quite massively parallel, with changes propagating at most at the speed of light), and for digital computational systems (parallel already at two CPU cores, with changes occurring in CPU registers and needing time to propagate down the hardware memory model before they're observable by other CPUs).
One of the wonderful things about Rich is that he's able to express these notions very clearly. It is all really obvious when you think about it, and I can't even claim I've been unaware of it; I've known it for at least three years now. It's just that before hearing this talk, I always had a very slight hope that someday, someone (infinitely wiser than me) would somehow be able to, you know, solve this. Eliminate this problem. What I realize now is that it's inherent. You can no sooner eliminate relativity of simultaneity, and the rest of the consequences it brings to the table, from our computational systems than you could cancel its effects in the physical world.
Rich does the only sane thing to do with a problem you can't eliminate - he embraces it. His programming language, Clojure, is an attempt at being the easiest way to write correct concurrent programs. Emphasis on all of "easy to write", "concurrent", and "correct". I learned about Clojure sometime last year when it came up on the jvm-languages mailing list. I read through the documentation on the website and came away thoroughly impressed. Then I met Rich and saw a 30-minute Clojure presentation last year at the JVM Language Summit, saw the 60-minute version of the same talk now at QCon, and I'm still impressed. Clojure embraces the problem of concurrent state modification by encapsulating state in "references" - the only mutable data types in Clojure, which all point to immutable values. The references can change over time to point to new values, though, and the big idea here is that each reference has defined concurrency semantics for how and when change occurs.
Now, there's likely a huge number of possible temporal semantics for how some entity's state can change. Clojure identifies several common semantic categories, though, and provides them for ease of use: synchronous coordinated (Clojure has an STM for that), synchronous uncoordinated, asynchronous uncoordinated, as well as isolated (thread local). I think you can build whatever you need using those.
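In concrete terms, this is roughly how the four categories look in Clojure code (a minimal sketch using the core API; the names and values are mine, not from the talk):

    ;; synchronous coordinated: refs, changed together in an STM transaction
    (def checking (ref 100))
    (def savings  (ref 50))
    (dosync
      (alter checking - 20)
      (alter savings  + 20))  ; both changes commit atomically, or neither does

    ;; synchronous uncoordinated: an atom, changed on its own
    (def counter (atom 0))
    (swap! counter inc)

    ;; asynchronous uncoordinated: an agent, updated at some later point in time
    (def log (agent []))
    (send log conj :event)

    ;; isolated: a dynamic var, rebound per thread
    (def ^:dynamic *context* nil)
    (binding [*context* 42]
      *context*)  ; => 42, but only on this thread

In every case the value itself is immutable; what changes over time is which value the reference points to, under the reference type's declared semantics.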
Anyhow, during Rich's talk I had a moment of enlightenment. Not because I heard something entirely new, but because the pieces of the jigsaw puzzle finally fell into place and revealed a picture. The picture does away with a false hope I had, which is good, as it is a clear and definite answer to a question; having accepted it, at least the direction forward is clear to me. During the rest of the day, I was telling pretty much everyone I met and their dog about how I was shown the inherent manifestation of relativity in finite-speed concurrent systems. (Joe Armstrong was just smirking at it, probably thinking "you wet-behind-the-ears kids, this should've been obvious to you for decades", and then we proceeded to make fun of all n-phase commit protocols, for any finite value of n.)
On multicore systems, there is no longer an absolute time. Every core runs on its own relative time, and there is no definite sequencing of events. It's time we built this assumption into our code, and stopped using tools and methodologies that are ignorant of this fact. And as Rich would point out, writing your code in Clojure is a good way toward that goal.
Friday, January 30, 2009
Speaking at QCon
Just a heads up that I'll be speaking at QCon in London this March, about using JavaScript in the enterprise. Looking forward to meeting you all there.