Thursday, September 09, 2010

Running YourKit UI remotely

I got into a somewhat unique situation a few weeks ago: I had a 16GB JVM heap dump that I had to analyze with a memory profiler. It was obviously hopeless to analyze it on my laptop, which only has 4GB of RAM. Fortunately, I was given access to a computer half the world away that had 64GB of RAM. So I moved my YourKit profiler instance onto that machine and tried to run it with X11 tunneled through SSH.


The good news is: it works.

The bad news is: it needs tweaking to work.

The first thing I ran into was that the GUI was hideously slow. I would click on a button in the profiler, and feedback would take several minutes. I'm not joking. Several minutes. Googling revealed a Sun Bug Database item, "Antialiasing/Compositing is slow on remote display". It suggests that to run GUI Java apps over X11, one should add

-Dsun.java2d.pmoffscreen=false

as a JVM startup flag. I added it to the profiler's yjp.sh launch script, and Holy Moly, it got much better. Granted, it was now rendering screens progressively from top to bottom, but it did so in a matter of seconds, not minutes. It now felt like something out of 1999 instead of 1993. Further Googling for various combinations of "X11" and "slow" and "ssh" turned up sites suggesting it's possible to make X11 over SSH a bit faster still, by telling SSH to use the Arcfour or Blowfish ciphers instead of AES, and to compress the traffic:

ssh -c arcfour,blowfish-cbc -C -X host

This led to further improvements; the progressive rendering got faster by another factor of two. That's as good as it gets. Now it feels like something out of circa 2001. I can work with that.
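For reference, the same cipher and compression settings can be made permanent per-host in ~/.ssh/config, so you don't have to remember the flags each time (the host name below is a placeholder; the pmoffscreen flag still goes into yjp.sh as above):

```
Host bighost.example.com
    Ciphers arcfour,blowfish-cbc
    Compression yes
    ForwardX11 yes
```

With that in place, a plain `ssh bighost.example.com` picks up all three options.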

Monday, June 28, 2010

I work for Twitter now

I work for Twitter now.

"Why?"

Twitter is an amazing company that provides an amazing service to humankind. I see it as a fundamentally new mode of communication that's quickly becoming part of global human culture. Whether it was the earthquake in Haiti or the post-election uprising in Iran, Twitter has proven to be a reliable and prompt medium for getting the latest news on what's happening in the world, directly from the people most involved in the events. (While simultaneously still allowing you to tell your friends what you had for breakfast, of course.) I appreciate its singular focus on doing one thing right, and that it has managed to remain simple to use. Twitter is experiencing 16% month-over-month (that's exponential!) traffic growth, and I'm very excited at the prospect of throwing my weight behind making it able to cope with this welcome challenge. The company is chock full of extremely bright people who will no doubt be a delight to work with, and maybe some of their awesome will rub off on me too...

I want to change the world for the better and have a positive influence on people's lives. Often, the most significant changes are the ones people don't even notice, because they make Things Work As They Are Supposed To. How big a change? I have no idea yet, but it could be vital. If it makes the difference between smooth operation and a Fail Whale for someone caught in the next civil unrest, using a momentarily available internet connection to let his family know he's okay, it can mean the world to them.

I appreciate Twitter's open-source-fostering culture. Today, the barrier to entry for influencing lots of people's lives is actually pretty low. It's called Open Source Software, and it's easy to participate and create serious value. I reckon that every hour I spent on my OSS projects saved my users thousands of man-hours in total, a number that will keep growing until the last of my code goes out of use (which will be a while...). The leverage is just insane. I just never figured out how to make it pay the bills, so I can't devote myself to it 100%. Twitter plays great in the OSS ecosystem, so I hope I'll be able to create value in two streams: improving Twitter through the OSS it relies on, and as a consequence, improving said OSS.

"So are you moving to the Bay Area?"

Yes. We're working on my visa, and once that's approved (hopefully around October), I'll move with my wife and children to San Francisco to join the company in person. Until then, I'm working remotely from home in Hungary. I'm absolutely thrilled at the prospect of living there, and I'm very much looking forward to this new adventure.

Monday, February 22, 2010

Windows Live nightmare

Last week I purchased a copy of Gears of War - it cost me some 12 bucks in local currency, one of the benefits of only buying 2-3 year old games. Yup, I'm the guy from this comic. So it happened that this weekend I decided to play with it a bit, and went to install it on a Sunday afternoon.


As installs go, it was uneventful, and I got to launching the game.

First snag: it tells me it needs to install an "AMD Processor Update", otherwise it will crash during gameplay. Okay, let it install. Reboot machine. Restart the game. I can play now, right?

Wrong.

It again tells me it needs to install the same update. Uh-oh. Okay, give it another try. Reboot machine. Restart the game.

I get to the main screen. Fine. I can play the game now, right?

Wrong.

I attempt to start the campaign, and it tells me that unless I have a Windows Live profile, I won't be able to save games!

Excuse me?!

*Pause*

Oh well, let's create a Windows Live account, I can live with that. It opens a browser, I fill out a few forms, and off we go. I can play the game now, right?

Wrong.

Next it tells me that it needs to update the local Windows Live client binaries. It tells me I can choose not to update now, but I won't be able to play the game until I do. Yeah, that's a great choice, indeed; how very thoughtful of them to give me options.

We're now at about the one-hour mark from the moment I inserted the install DVD. But once the update is done, I can finally log in and play the game, right?

Wrong.

It now tells me that Windows Live is "not supported in my region." How splendid! If I live in Hungary, I'm not allowed to save my bloody games, is that what you are telling me? Apparently, yes.

Another hour of googling and forum reading ensues. I end up at dead links on sites named xbox.com and gamesforwindows.com and the like, where I can supposedly change my region but in reality can't. I end up on a site where I finally can (I think it's some long domain name with the words "windows" and "account" and "services" in it, in some order). I lie that I'm in the UK. It seems to accept it. So, I can now play the game, right?

Wrong.

They still claim I'm in an "unsupported region". Turns out they're protecting against smartypants like me: naive people who believed the Internet is global and makes it irrelevant where you physically sit. Nope. No matter what you change your location to, the initial location from when you created your account sticks with you. On one forum, a user tells me that if I were to create a different account now (forget the one I already have, it's tainted forever, he tells me), initially log in through some obscure UK site, and supply a plausible-looking UK address and phone number, then I could... sheesh.

I give up, play Bioshock for 15 minutes (all I had left of my evening), go to sleep, cursing my actions that led me to being exposed to a Microsoft technology again.

Next morning, more forum trawling. Turns out that you can create a "local" or "offline" profile, and that it lets you save the game locally. Hey, that's all I really ever wanted! So, how do I do it? Because, sure enough, I didn't see this anywhere on the UI when they were gently pushing me in the direction of creating a Live account.

Turns out, you need to click some kind of a "Learn More" button during the account creation process, and the help document will contain a link (only if you scroll to the bottom of the page) that will open a dialog to create a local account. At least it didn't say "Beware of the tiger" on the door...

So, let me get this straight: some Microsoft genius, in an attempt to drive as many users as possible into signing up for their Live service, thoroughly hid the offline account creation, even though this causes genuine grief for the subset of those users who are in an "unsupported region".

To use an analogy (it'll be a car analogy, of course), it's as if you'd have a four-lane highway, and all traffic signs point you to drive there to reach your destination, and of course you do, but after a bit of a drive, there's suddenly a blockade on the road, and since the license plates on your car are from an "unsupported region", you aren't allowed to drive further, and the grumpy officers won't tell you how you're supposed to get to your destination, only that they have specific instructions to not let pass the lower castes such as you are. So you drive back, and eventually find some secret vagabond scratchings on a wall (that'd be the forum posts) that point you to an otherwise unmarked dirt path through a forest, that will let you eventually reach your destination.

In the end, I have my local profile, so I can play the game now, right?

Right. Except I feel like I've been kicked in the teeth by the whole experience so far, so it'll be some time before I have any desire to go back to it. Honestly, yesterday evening I was an inch from ritually burning the DVD. And I know that the next time I hear that a game requires Windows Live in any capacity, that'll be a good sign for me to steer clear of it.

Monday, November 16, 2009

November moment

Oh, the sudden urgency in the people’s steps in the street as the raindrops start hitting the pavement. I don’t have an umbrella, so I’m drawing close to the buildings on the way home for some little shelter their walls and balconies can provide. I thought these buildings familiar, but either because I’m passing so close to them, or because of the rain, they smell different. It’s surprising, but pleasant and triggers some previously deeply buried childhood memories.

Tuesday, September 08, 2009

Review of the Commodore 64 emulator for the iPhone

UPDATE: Apple yanked the app because of the BASIC interpreter hole, and the developers have plugged it and resubmitted the app. Way to go, Apple. I mean, what harm could that BASIC interpreter do? It has no means of loading external code - no access to the local filesystem of the underlying OS, no network connectivity, nothing. Are they afraid I'll manually type in a program from a listing published in a magazine or something? I got tons more childlike wonder from toying with the interpreter than I ever could from playing Jack Attack and Dragon's Den combined! Sheesh... On to the original review:

Against my better judgment, I ended up taking my iPhone with me to the bathtub yesterday evening. I needed to check something really urgent, like the IMDB rating of "Watchmen" or similar (I'd just finished the book). You know, the kind of stuff that's worth risking water damage to your high-tech gadget. I ended up browsing my RSS feeds instead, though, and spotted news of a Commodore 64 emulator being approved by Apple and available on the App Store.
Now, you need to know that the first home computer I was ever exposed to was a Commodore 64, at around the age of eleven. I have an enormous amount of emotional attachment to it, having been an enthusiast hacker for it as a kid. So, I was sitting there in my bathtub thinking, "this would be nice to get", and then realized: "wait a minute, I can get it; right here, right now."
Let's pause for a moment to think about just how big an enabler the Internet is in our lives, folks. I mean, there I was, purchasing a Commodore 64 emulator from my bathtub, and a minute later I had launched it and started toying around with the UI.

That's where disappointment started to set in.

Oh, don't get me wrong. The developers of this application (and of the excellent open source Frodo C64 emulator they based it on) deserve every praise possible. I had used Frodo several years earlier, and it is a painstakingly precise reproduction of a Commodore 64, down to hardware side effects. The iPhone application is tasteful, snappy, responsive, and the UI is okay.
I do actually have one gripe with the UI, though. See, the Commodore 64's horizontal resolution is 320 pixels, exactly the same as the iPhone's. Since the developers added a surrounding graphic resembling the venerable Commodore 1701 monitor, the mask obscures some of the pixels at the edges. This is actually very annoying, since lots of games display important information along the top and bottom edges. These machines emitted an image with two distinct areas, named "border" and "paper". The border was the outermost part of the image and had a single attribute: color. The paper was a smaller rectangle within the border where the image was actually displayed. The purpose of the border was specifically to act as a disposable margin that could be clipped to size by your monitor (in many cases, a TV screen) so that you wouldn't lose paper pixels. The developers of the iPhone emulator thus improved aesthetics at the expense of functionality.

But that's not what I'm disappointed with.

Honestly, I could be disappointed by the initial selection of games; they're all ancient simplistic titles even by Commodore 64 standards, old 1983 titles where developers still didn't know how to utilize the platform to its potential. There will be more games available later, so I'm not too worried about that. (I hope they will release Elite...)

That's not what I'm disappointed with.

What I'm disappointed with is the lack of the Commodore 64 essence, as I experienced it. That essence, ladies and gentlemen, is hacking. Even as a kid, I disliked it when people used the machine exclusively as a game console. As a bit of personal history, let me tell you that my family was not exactly wealthy when I was a kid, and even that is an understatement. After I became interested in computers (following a series of articles in a monthly magazine for school kids), I would write my little programs (initially BASIC, later 6510 machine code) in a notebook, then pester one of the three people in our village who had a machine to give me a bit of time to try them out. I saw the potential for creating my own stuff in these machines, and couldn't imagine why someone fortunate enough to actually be able to afford one would use it only to play games. I disliked people who didn't see beyond gaming (I'm not saying it was justified, or that I think the same today; I was an eleven-year-old kid then, remember?).

The Commodore 64 was an utterly open platform. Even from BASIC, you could POKE around with it (for those not in the know, POKE is a BASIC command for directly setting the contents of a memory location). Even before I had a Commodore 64 of my own (I had to wait all the way until 1989 to be able to afford one; cool kids were playing with Amigas by then), I had a book that contained a description of the BASIC; of the contents of both the BASIC and System ROM (complete with all system calls and the memory locations used for settings, buffers, and so on); of all the peripheral chips (VIC, SID, CIA1, CIA2, etc.), their mapped memory locations, and how to program them; a complete reference for 6510 machine code; and finally, complete hardware schematics for the machine. If you had this book, you could do anything with the machine. Anything. And believe me, I did a lot. I won't go into details now, as I'm well into get-off-my-lawn territory here.

That's why I'm disappointed that I won't be able to do a lot with this one, except play old games on my phone. I don't imagine that under Apple's regime I would ever be able to fire up Turbo Assembler and write some rasters, if you know what I mean. And that's what makes me sad.

Apple only even approved the emulator once the developers had disabled the BASIC, so it's not the familiar C64 screen that welcomes you on startup. It looks like this instead:



However, it turns out there's a workaround; the BASIC ain't disabled, it's just hidden. If you go to "Advanced" and turn on "Always show full keyboard", then launch a game, go to the "EXTRA" keys, and tap "RESET", you end up with the familiar screen:



(Mind you, the problem of the monitor mask clipping the edge pixels is particularly pronounced in this picture - you can't really see the first character in the line!) I had a bit more fun with it now that I had a command line, though. For starters, let's see if there's something on the "floppy disk":



Sure there is! Looks like a floppy for "Dragon's Den" (one of the games included with the app). Interestingly, it doesn't matter which game I started before resetting the emulator: the "floppy" in device 8 was always that of "Dragon's Den". I didn't find other numbered devices containing the other games' "floppies" either. For now. However, here's the kicker. (Or an easter egg, if it's intentional, although I don't think it is.) If you load the program from the floppy and run it manually:



OMG, YOU GET AN INTRO!:



That's correct, ladies and gentlemen - the developers seem to have had trouble locating a genuine copy of the game, so they bundled one that was distributed by a cracker group. Back in the day, it was customary for people who cracked games for, erm, "unofficial distribution" to prepend a small program called an "intro" - a colorful, blinking screen with greetings to fellow members of the trade and good chiptunes. These also usually led to menus for enabling cheats, and sometimes to on-screen instructions for the game in case the player didn't have the original manual at hand. Like, say, if you're playing on an iPhone, right? (Honestly, if that hadn't been there, I would never have figured out this particular game on my own. So the intro actually added value for me. Also, the fact that someone had this game in cracked form but not in uncracked form strongly suggests that if it weren't for the crackers, we wouldn't have this game today. Just sayin'.)

Needless to say, the intro isn't there if you launch the game from the iPhone app menu.

Verdict? Nice easter egg with the intro, folks. Beautiful, polished iPhone application. Potential for more game titles to come. But if you were more than a game player on the Commodore 64, don't hold your breath. You'll likely never be able to upload any of your own old-time C64 creations to the device to show them off. For that, you'll still need an emulator on your desktop OS.

I hope at least they'll release Elite, though. That'd be awesome.

Tuesday, March 17, 2009

Relativity of simultaneity

A problem with staying in a profession for long years is that as time passes, there are fewer and fewer things that can truly excite or surprise you. That's why I was particularly delighted to experience a moment of sudden enlightenment while listening to Rich Hickey's talk on "Persistent Data Structures and Managed References" last Thursday at QCon in London. The title might not sound exciting, but I have to tell you, if this were the only talk I attended, it alone would have made the conference worth attending. I don't say this often. The slides are here, although if you just read them you're obviously missing out on Rich's incredibly good verbal presenting style.

Rich's view (with which I have to agree; in the time I've known Rich, I have yet to hear a single thing from him I'd disagree with, whether it be concurrency, memory models, or ideas for what to do in San Francisco if you only have a single day to explore it) - so, Rich's view is that the idea of variables in traditional programming languages is broken, because it assumes a single thread of control at a time. This used to be true for single-threaded programs, and even, to a degree, for multi-threaded programs running on single-CPU systems.

He argues that we need well defined time models for state changes in our concurrent computational systems. A lack of a time model fails to capture the situation where two observers (threads, CPUs) can observe two events (changes of state) in different order.

Then it suddenly hit me! I learned this stuff in school! Relative timelines, observers... Rich is talking about relativity of simultaneity!

(Which is one of the simpler consequences of Einstein's special theory of relativity.)

Wait a moment. Finding such a surprising parallel between the physical world around us and our digital computational systems seems unlikely at first, but thinking more about it, it makes perfect sense. For event timelines to appear different to different observers, we only need two prerequisites: truly parallel processing, and a finite speed of change propagation. Both hold in the real world (which is quite massively parallel, with changes propagating at most at the speed of light), and in digital computational systems (parallel already at two CPU cores, with changes occurring in CPU registers and needing time to propagate down the hardware memory model before they become observable by other CPUs).

One of the wonderful things about Rich is that he's able to express these notions very clearly. It's all really obvious when you think about it, and I can't even claim I haven't been aware of it for at least three years now; it's just that before hearing this talk, I always held a very slight hope that someday, someone (infinitely wiser than me) would somehow be able to, you know, solve this. Eliminate this problem. What I realize now is that it's inherent. You can no sooner eliminate relativity of simultaneity, and the rest of the consequences it brings to the table, from our computational systems than you could cancel its effects in the physical world.

Rich does the only sane thing to do with a problem you can't eliminate - he embraces it. His programming language, Clojure, is an attempt at being the easiest way to write correct concurrent programs. Emphasis on all of "easy to write", "concurrent", and "correct". I learned about Clojure sometime last year when it came up at the jvm-languages mailing list. I read up the documentation on the website, and came away thoroughly impressed. Then I met Rich and saw a 30-minute Clojure presentation last year at the JVM language summit, saw the 60-minute version of the same talk now at QCon, and I'm still impressed. Clojure embraces the problem of concurrent state modifications by encapsulating the state into "references" - the only mutable data types in Clojure, which all point to immutable values. The references can change in time though to point to new values, and the big idea here is that each reference has defined concurrency semantics for how and when change occurs.

Now, there's likely a huge number of possible temporal semantics for how an entity's state can change. Clojure identifies several common semantic categories, though, and provides them for ease of use. We have synchronous coordinated (Clojure has an STM for that), synchronous uncoordinated, asynchronous uncoordinated, as well as isolated (thread-local). I think you can build whatever you need using those.
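To make those categories concrete, here's a minimal sketch using Clojure's corresponding reference types (refs, atoms, agents, and dynamic vars are the actual core constructs; the particular names and values are just mine for illustration):

```clojure
(def balance (ref 0))      ; synchronous, coordinated: changed only inside an STM transaction
(def counter (atom 0))     ; synchronous, uncoordinated: atomic compare-and-swap updates
(def log     (agent []))   ; asynchronous, uncoordinated: updates queued to an agent thread
(def ^:dynamic *depth* 0)  ; isolated: rebindable per thread with (binding ...)

(dosync (alter balance + 100)) ; the transaction coordinates this ref with any others it touches
(swap! counter inc)            ; retried CAS loop; no locks held
(send log conj :event)         ; returns immediately; conj is applied later on an agent thread
```

In every case the value pointed to is immutable; only the reference's identity changes over time, under the declared semantics.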

Anyhow, during Rich's talk I had a moment of enlightenment. Not because I heard something entirely new, but because the pieces of the jigsaw puzzle finally fell into place and revealed a picture. The picture does away with a false hope I had, which is good: it's a clear and definite answer to a question, and having accepted it, the direction forward is at least clear to me. During the rest of the day, I was telling pretty much everyone I met, and their dog, about how I'd been shown the inherent manifestation of relativity in finite-speed concurrent systems. (Joe Armstrong was just smirking at it, probably thinking "you wet-behind-the-ears kids, this should've been obvious to you for decades", and then we proceeded to make fun of all n-phase commit protocols, for any finite value of n.)

On multicore systems, there's no longer absolute time. Every core runs on its own relative time, and there's no definite sequencing of events. It's time we built this assumption into our code, and stopped using tools and methodologies that are ignorant of this fact. And as Rich would point out, writing your code in Clojure is a good way toward that goal.

Friday, January 30, 2009

Speaking at QCon

Just a heads up that I'll be speaking at QCon in London this March, about using JavaScript in the enterprise. Looking forward to meeting you all there.