Sunday 28 February 2016

Leviathan Wakes (The Expanse #1), by James S.A. Corey

I absolutely love the TV series The Expanse and so, after ten torturous weeks in which I would alternately be filled with joy at the release of another episode, then fall into despair when it ended, I've decided to cut out the middleman and read the damn book. And now that I've read it, I have to say that I am really glad to have seen the show first. It's not that it is a bad book - it is not, it's very good - but the show is better, and having the characters in the series blend with the ones on the page turns them into something more fleshed out, more complex. And I am not the only one thinking that.

For example, in the book the viewpoint alternates, chapter by chapter, between Miller and Holden. The chapters don't even have titles other than these two names. In the book Naomi is a slight romantic interest, but nothing more than that. Amos is just there, doing stuff. Alex is barely fleshed out. It is Holden who is the undisputed captain of the Rocinante and that's it. Similarly, Miller's partners or even Anderson Dawes are just shadows of the complex characters they appear to be in the TV show. One of the things that I really loved about the show was that it was reasonably accurate scientifically and I kind of thought the book would be even more detailed about it. On the contrary, it was not. The authors even said in an interview that they never intended to make a hard sci-fi thing, but just to tell the stories of people in the less used fictional period between stuck-on-Earth and interstellar humanity. The weird language melange that is shown in the TV series is only vaguely mentioned in the books. Also, the Earth lady in the show is not in this book at all.

But enough about the show. Leviathan Wakes is the first in The Expanse series of more than five books, written by James S.A. Corey, which is actually a pen name for Daniel Abraham and Ty Franck, both collaborators of George R.R. Martin (Franck was his personal assistant, for example). More than five books because there are also some interstitial short stories and I am sure they are going to write some more stuff. The plot of the book is about people living and working in space. You have densely populated Earth, a colonized Mars that is undergoing terraforming and the Belt, people living on the large asteroid belt between Mars and Jupiter. And of course, people being people, they hate each other. Enter an unknown entity that lights the powder keg for reasons unknown that are slowly revealed as the book goes on.

Miller is the typical noir detective: cynical, distrustful, obsessive to the extreme and trying to prove to the world and himself that he's still "got it". Holden is the typical good guy, wanting to save everybody and hoping that all will be well if people just know the truth. Their dynamic is interesting, but for me the story was more captivating before they met each other. The first season of the show ends when the book is about half way through, but from then on there is a period of inactivity, one of those parts when everybody knows what is going on and still has to go through the motions. In real life this is where the role of the characters would be over, but since it is a book, the same characters continue the story, although it feels a bit disconnected. The ending of the book is epic in its grand scale, but almost boring from a literary standpoint.

Bottom line: I will continue reading the next books of The Expanse, however I have to say that I am shocked at the difference in quality between the show and the book. It is probably much easier to do well when you have good actors carrying the load and a lot more book material to work with before you start envisioning your world, but still, awe is what the TV show makes me feel.

Saturday 20 February 2016

Vita de Vie - Bucuresti - 20 Februarie 2016 - 20 de ani de Vita de Vie

If there is anything that I am forced to say about Adrian Despot, the frontman of Vita de Vie, it is that he is a true artist. The concert tonight was spot on, even if I am not a fan of the band. The guest bands were pretty good, too, but I have to admit that for most of them I was waiting for them to stop playing so I could listen to the great playlist from DJ Hefe. The audience was really mixed, ranging from little kids to old people. It felt great to see all these people singing along and reliving some of the greatest hits of the band.

I started by watching the concert online. It seemed kind of extreme to go there at 16:00 and stay until 23:00, especially since I was worried about the food/drink/toilet situation and there was an afterparty as well. I have to say that all my worries were for naught. Access to the food and drink stands was really decent and there was no queue at the toilets outside. Of course, the drinks and food were shitty and overpriced, but that was to be expected. It also was a really wonderful thing to stand in the middle of a crowd of people and not feel like I was smoking a cigar. The law against smoking in public places has finally reached Romania and it felt great.

By the time we got to the concert hall - umm, heated tent, but it was better than it sounds - the last band before Vita de Vie was playing, the rather good Relative, from Cluj. Energetic, professional, kind of bad public speakers, but they have time to improve. They were pretty emotional about their first gig in Bucharest and about performing before so many people, so they were sweet. Then the main show started, with light shows, projections and a volume that felt twice as loud as the bands before. My ears are still ringing.

Unfortunately something happened that ruined my evening, so I went home after the concert, rather than go to the after-party at Fabrica. I wish I had been in the mood for that, but well, shit happens. So yeah, the show was great, the music pretty good - although I felt like the band would have done a better job with another lead singer :) The point is that Vita de Vie, like any other band - let's be honest - is a project. Individual people don't count unless they push the project further, make it better somehow. Adi Despot made that obvious when he called the previous members of the band to play some songs, as well as some collaborators from side projects started by current members of the band. Like him or not, he did bring showmanship to the project and he deserves to be the frontman.

Bottom line, I was impressed by the way the concert was organized (I am used to those really bad events where people just stand brushing against each other, suffocating in smoky, improperly ventilated places, trying their best not to slip in the beer and piss left by people who couldn't get fast enough to the few malfunctioning toilets provided). I was also impressed with the guest bands, doing a really professional job, even if they still have a lot to learn.

You might be interested in the Facebook link of the event.

Thursday 18 February 2016

Firebase - Queries

In the previous post I was discussing Firebase, used from Javascript, and that covered initialization, basic security, reading everything and inserting. In this post I want to discuss complex queries: filtering, ordering, limiting, indexing, etc. For that I will get inspiration (read: copy with impunity) from the Firebase documentation on the subject, Retrieving Data, but make it quick and dirty... you know, like sex! Thank you, ma'am!

OK, the fluid interface for getting the data looks a lot like C# LINQ and I plan to work on a Linq2Firebase thing, but not yet. Since LINQ itself got its inspiration from SQL, I was planning to structure the post in a similar manner: how to do order by, top/limit, select conditions, indexing and so on, so we can really use Firebase like a database. An interesting concept to explore is joining, since this is an object database, but we still need it, because we want to filter by the results of the join before we return the result, like getting all the transactions of users that have the name 'Adam'. Aggregation is another thing that I feel Firebase needs to support. I don't want to download a billion records just to compute the sum of a property.

However, the Firebase API is rather limited at the moment. You get .orderByChild, then stuff like .equalTo, .startAt and .endAt and then .limitToFirst and .limitToLast. No aggregation, no complex filters, no optimized indexing, no joining. As far as I can see, this is by design, so that the server is as dumb as possible, but think about that 1GB for the free plan. It is a lot.

So, let's try a complex query, see where it gets us.
ref.child('user')
   .once('value', function(snapshot) {
      var users = [];
      snapshot.forEach(function(childSnapshot) {
         var item = childSnapshot.val();
         if (/adam/i.test(item.name)) {
            users.push(item.userId);
         }
      });
      getInvoiceTotalForUsers(users, DoSomethingWithSum);
   });


function getInvoiceTotalForUsers(users, callback)
{
   var sum = 0;
   var count = 0;
   for (var i = 0; i < users.length; i++) {
      var id = users[i];
      ref.child('invoice')
         .orderByChild('userId')
         .equalTo(id)
         .once('value', function(snapshot) {
            snapshot.forEach(function(childSnapshot) {
               var item = childSnapshot.val();
               // a query can only filter on one child, so the price range is checked client-side
               if (item.price >= 10 && item.price <= 100) {
                  sum += item.price;
               }
            });
            // count completed per-user queries, not individual invoices
            count++;
            if (count === users.length) callback(sum);
         });
   }
}

First, I selected the users that have 'adam' in the name. I used .once instead of .on because I don't want to wait for new data to arrive, I just want the data so far. I used .forEach to enumerate the data from the value event. With the array of userIds I call getInvoiceTotalForUsers, which queries the invoices of each user and sums up those with a price between 10 and 100 (the price range is checked client-side, since a Firebase query can only filter on one child at a time), finally calling a callback with the resulting sum of invoice prices.

For me this feels very cumbersome. I can think of several methods to simplify it, but the vanilla code would probably look like the one above.

Firebase - a free Javascript/REST accessible cloud NoSQL database - Introduction

I have been looking for a long time for this kind of service, mainly because I wanted to monitor and persist stuff for my blog. Firebase is all of that and more and, with a free plan of 1GB, it's pretty awesome. However, as it is a NoSQL database and as it is accessed via Javascript, it may be a bit difficult to get at first. In this post I will be talking about how to use Firebase as a traditional database using their Javascript library.

So, first off, go to the main website and sign up with Google. Once you do, you get a page with a 5 minute tutorial, quickstarts, examples, API docs... but you want the ultra-quick start! Copy-pasted working code! So click on the Manage App button.

Take note of the URL where you are redirected. It is the one you will use for all data access as well. OK, quick test code:
var testRef = new Firebase('https://*******.firebaseio.com/test');
testRef.push({
   val1: "any object you like",
   val2: 1,
   val3: "as long as it is not undefined or some complex type like a Date object",
   val4: "think of it as JSON"
});
What this does is take that object there and save it in your database, in the "test" container. Let's say it's like a table. You can also save objects directly in the root, but I don't recommend it, as the path of the object is the only thing telling you what type of object it is.

Now, in order to read inserted objects, you use events. It's a sort of reactive way of doing things that might be a little unfamiliar. For example, when you run the following piece of code, you will receive, right after you connect, all the objects you ever inserted into "test".
var testRef = new Firebase('https://*******.firebaseio.com/test');
testRef.on('child_added', function(snapshot) {
   var obj = snapshot.val();
   handle(obj); //do what you want with the object
});

Note that you can use either child_added or value as the retrieval event. While 'child_added' is fired for each retrieved object, 'value' returns one snapshot containing all data items, then proceeds to fire on each added item with full snapshots. Beware! That means that if you have a million items and you do a value query, you get all of them (or at least attempt to, I think there are limits), then on the next added item you get a million and one. If you use .limitToLast(50), for example, you will get the last 50 items, then when a new one is added, you get another 50-item snapshot. In my mind, 'value' is to be used with .once(), while 'child_added' with .on(). More details in my Queries post.

Just by using that, you have created a way to insert and read values from the database. Of course, you don't want to leave your database unprotected. Anyone could read or change your data this way. You need some sort of authentication. For that, go to the left and click on Login & Auth, then go to Email & Password and configure the users that can log in to your application. Notice that every user has a UID defined. Here is the code to use to authenticate:
var testRef = new Firebase('https://*******.firebaseio.com/test');
testRef.authWithPassword({
   email : "some@email.com",
   password : "password"
}, function(error, authData) {
   if (error) {
      console.log("Login Failed!", error);
   } else {
      console.log("Authenticated successfully with payload:", authData);
   }
});
There is an extra step you want to take: securing your database so that it can only be accessed by logged-in users. For that you have to go to Security & Rules. A very simple structure to use is this:
{
  "rules": {
    "test": {
      ".read": false,
      ".write": false,
      "$uid": {
        // grants write access to the owner of this user account whose uid must exactly match the key ($uid)
        ".write": "auth !== null && auth.uid === $uid",
        // grants read access to any user who is logged in with an email and password
        ".read": "auth !== null && auth.provider === 'password'"
      }
    }
  }
}
This means that:
  1. It is forbidden to write to test directly, or to read from it
  2. It is allowed to write to test/uid (remember the user UID when you created the email/password pair) only by the user with the same uid
  3. It is allowed to read from test/uid, as long as you are authenticated in any way

Gotcha! This rule list allows you to read and write whatever you want on the root itself. Anyone could just waltz on your URL and fill your database with crap, just not in the "test" path. More than that, they can just listen to the root and get EVERYTHING that you write in. So the correct rule set is this:
{
  "rules": {
    ".read": false,
    ".write": false,
    "test": {
      ".read": false,
      ".write": false,
      "$uid": {
        // grants write access to the owner of this user account whose uid must exactly match the key ($uid)
        ".write": "auth !== null && auth.uid === $uid",
        // grants read access to any user who is logged in with an email and password
        ".read": "auth !== null && auth.provider === 'password'"
      }
    }
  }
}

In this particular case, in order to get to the path /test/$uid you can use the .child() function, like this: testRef.child(authData.uid).push(...), where authData is the object you receive from the authentication method and which contains your logged-in user's UID.

The rule system is easy to understand: use ".read"/".write" and a Javascript expression to allow or deny that operation, then add child paths and do the same. There are a lot more things you could learn about the ways to authenticate: one can authenticate with Google, Twitter, Facebook, or even with custom tokens. Read more at Email & Password Authentication, User Authentication and User Based Security.

But because you want to do a dirty little hack and just make it work, here is one way:
{
  "rules": {
    ".read": false,
    ".write": false,
    "test": {
      ".read": "auth.uid == 'MyReadUser'",
      ".write": "auth.uid == 'MyWriteUser'"
    }
  }
}
This tells Firebase that no one is allowed to read/write except in /test, and only if their UID is MyReadUser or MyWriteUser, respectively. In order to authenticate for this, we use this piece of code:
testRef.authWithCustomToken(token,success,error);
The handlers for success and error do the rest. In order to create the token, you need to do some cryptography, but never mind that, there is an online JsFiddle where you can do just that without any thought. First you need a secret, for which you go into your Firebase console and click on Secrets. Click on "Show" and copy-paste that secret into the JsFiddle "secret" textbox. Then enter MyReadUser/MyWriteUser in the "uid" textbox and create the token. You can then authenticate into Firebase using that ugly string that it spews out at you.

Done, now you only need to use the code. Here is an example:
var testRef = new Firebase('https://*****.firebaseio.com/test');
testRef.authWithCustomToken(token, function(err, authData) {
   if (err) { alert(err); return; }
   testRef.on('child_added', function(snapshot) {
      var message = snapshot.val();
      handle(message);
   });
});
where token is the generated token and handle is a function that will run with each of the objects in the database.

In my case, I needed a way to write messages on the blog for users to read. I left read access on for everyone (true) and used the token idea from above to restrict writing. My html page that I run locally uses the authentication to write the messages.

There you have it. In the next post I will examine how you can query the database for specific objects.

Wednesday 17 February 2016

Breakthrough (Breakthrough book 1), by Michael C. Grumley

I wasn't expecting much from this book, as another one from the same page offering free books from their authors was kind of disappointing. However, this is a true book: it is long enough, well written, with developed characters and with an ending that delivers closure to all the story arcs. Indeed, it closes all of them so well that it is kind of weird to see that it is part of a series, containing the same characters no less. I mean, come on, how many times can the same people save the planet? Shut up, Marvel!

It was surprising to me to find out that Breakthrough was Michael C. Grumley's debut book. It is professionally written. Nothing exceptional, mind you, but nothing you can possibly find wrong with it. And the subject of the book was complex and interesting, involving talking dolphins, undersea aliens, covert military operations (no, it is not a Seaquest ripoff), which reminded me a little of Creatures of the Abyss.

The ending was a bit rushed, I guess, and contained that annoying trope "You are not yet ready, humans!". Fuck you, aliens, if all you've got to show for your evolution are plans to either destroy or patronize us! Plus some crowd pleasing death avoidance which felt wrong. But overall it was a good book, way above what I would call average. Since it is offered for free, you can download it and read it right now. And if you like it, the author offers even more free stuff on his site.

Monday 15 February 2016

Building robust software

I was reading this summary of a talk that Dr. Gerard Holzmann held at the USENIX Hot Topics in System Dependability mini-conf on 7 Oct 2012 in Hollywood, California. In it there is a link to what the people at JPL decided to use as the core of their coding standard: The Power of 10. Yeah, it sounds like a self-help system for addicts, but in fact it is a very smart idea. You see, when you code for the JPL you are talking about code that you design and test on Earth, then run in space, often years after it was first developed. It needs to be robust, it needs to be as safe as possible and it needs to make detecting problems early on easy. They tried with a style coding standard, but they failed, mostly because people were not able to follow all the rules they decided on. Here comes the brilliant idea of taking the ten most risk-alleviating coding rules and making them a kind of core of their development style. A form of software ten commandments, if you will.

Some of the rules there are quite counterintuitive. You may check them in link format here and in PDF format here. I was particularly interested in rules 2 and 3: give all loops a fixed upper bound (so make sure there will never be an infinite loop) and allocate everything you need before you run the program (so eliminate things like dynamic memory allocation or garbage collection). The others are either common sense or already implemented in modern programming languages.

If I were to implement this, I would try to encapsulate the idea of finite loops, so instead of foreach/for loops I would use a class with Foreach/For methods (akin to Parallel) that enforce an upper bound. The memory allocation thing is trickier in .NET. The idea of a garbage collector is already built into the system. The third rule in P10 says "Memory allocators, such as malloc, and garbage collectors often have unpredictable behavior that can significantly impact performance". I wonder if there is any way to quantify the performance losses coming from the framework's memory allocation and garbage collection. As for disabling this behavior, I doubt it is even possible. What I could do is instantiate all the classes used for data storage (all data models, basically) that I will ever need at some initialization stage, then eliminate any usage of new or any declaration of new objects and variables of that sort. It kind of goes against the tenets of OOP (and against P10's rule number 6, BTW), but it could be interesting to experiment with.
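To make the bounded loop idea more concrete, here is a minimal sketch of what such a wrapper could look like; the BoundedLoop name and the exception it throws are my own inventions for illustration, not something taken from P10 or from the .NET framework:

using System;
using System.Collections.Generic;

// Hypothetical helper that enforces an explicit upper bound on every loop,
// in the spirit of P10's rule about giving all loops a fixed limit.
public static class BoundedLoop
{
   public static void For(int start, int exclusiveEnd, int maxIterations, Action<int> body)
   {
      var iterations = 0;
      for (var i = start; i < exclusiveEnd; i++)
      {
         if (++iterations > maxIterations)
            throw new InvalidOperationException("Loop exceeded its upper bound of " + maxIterations + " iterations.");
         body(i);
      }
   }

   public static void Foreach<T>(IEnumerable<T> source, int maxIterations, Action<T> body)
   {
      var iterations = 0;
      foreach (var item in source)
      {
         if (++iterations > maxIterations)
            throw new InvalidOperationException("Loop exceeded its upper bound of " + maxIterations + " iterations.");
         body(item);
      }
   }
}

Using it would look like BoundedLoop.Foreach(users, 10000, user => Process(user)); the bound itself has to come from what the data can legitimately contain, which is exactly the kind of thinking the rule is meant to force.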

What do you think? Anyway, feel free to ignore my post, but read the document. The people at JPL are not stupid! I loved this minimalist idea they used: just reduce all the coding rules to the ten most important ones.

Google trend effect on page views?

I have implemented a system that logs what people do on my blog, with the intent of making it more useful to my readers. In doing so I created a live dashboard where people coming and leaving are displayed in real time. The conclusion is pretty humbling, but I have also noticed a pattern that might reflect badly on the state of the Internet today.

The conclusion I was talking about is that, even if I write about a lot of things, from books to software, from WPF to Javascript, the most visited posts by far are about why the Bittorrent client gets stuck, how to remove ads by installing Privoxy, and Sift3, my string comparison algorithm. All of that info one can get from the Popular posts column on the right of the blog, but I had no idea how many people visit it only to find out how they can download their movies faster!

And then there are the programming blog posts. I am filled with pride when people open a link to learn something from my experiences. And then I see that they are looking at the posts about Crystal Reports, AjaxControlToolkit and the old ASP.Net Ajax calls. Occasionally they come for the WPF bit, which is great, but the conclusion is clear: people are mostly interested in the old posts, the ones describing older technologies that no one is talking about anywhere anymore. True, I have not posted anything significant in the last two years, but still, I feel disappointed. My blog's merit here seems to be that it is still online!

But then I realized something else. Sometimes I feel joy at seeing that a visitor opens a post that no one has opened recently. Yet, in a very short time, other people start to open the same link. It has happened repeatedly, several days in a row, so it can't be a coincidence. And people are coming from all over: Canada, US, Brazil, Mozambique, Ghana! I can only explain it with the theory that, once visited, a link increases in visibility, its Google rank goes up, thus passing a threshold that makes it appear on the first search pages. It is a snowball effect, which in part I understand and agree with, but I can't stop wondering if it doesn't apply everywhere. Instead of going for the relevance that Google and other big search engines aspire towards, they cheat by treating each click as a Facebook Like! More people read it, so more people should read it, which they do, and so on and so on.

The bottom line is that I wouldn't want to see a race towards a common goal be treated as a common race towards a goal. Let all pages share the glory, rank them based on content, not the preferences of people searching for stuff. How long before Google will helpfully suggest to me to go download a movie rather than search for something for work?

Saturday 13 February 2016

Better World (Legacy Code prequel freebie), by Autumn Kalquist

I got this from a website where several self-published authors shared free e-books for promotion purposes. I chose Better World because of its description: "The last humans spent centuries searching for a new Earth. Now they face extinction.
For three hundred years, arks have carried the last remnants of humanity through dark space. The ships are old, failing, and every colonist must do their duty to ensure the fleet’s survival." Pretty cool premise, if you ask me.

Now, Better World is not accidentally a free book. It is short, ends with a "to be continued" and has no other purpose than to pull the reader into Autumn Kalquist's Legacy Code series. To me it felt a bit amateurish, which is weird. I would have thought something that would pull your audience into your work should be better edited, if not better written.

The basic plot revolves around an 18-year-old girl called Maeve, a low class citizen of the colony ships fleeing Earth to find a better world. The story starts with her trying to kill herself, while a do-gooder boy who is obviously hot for her stops her at the last minute. I found it interesting that the heroine of the story starts off as weak, egotistical, scared, with low self esteem and living in a world where she is pretty much powerless. Everything that - if the book were written by a guy - would have prompted people to denounce his misogynistic view of the world. Yet every writing book worth mentioning affirms that the character has to begin as powerless and defective in order to evolve. And indeed, by the end of the book you find that Maeve realizes her own strength and courage when faced with true challenges, not just with teenage angst.

However, the scenes lacked power and, whenever something interesting happened, the author introduced new characters and new ideas instead of focusing on the potential of the current situation. Maeve wants to kill herself, a savior male is introduced. She rebels against authority, a younger, more naive girl is introduced in order to suffer the consequences for her. The young girl gets hurt, a love interest - another girl - appears out of nowhere to take the focus away from the shame and guilt. An "enforcer", a weak minded man drunk on his policing power, is making her life hell, and she reminisces about her dead parents, killed by another enforcer's decision in the past. Every single time the plot was getting close to good, something was introduced that deflated the tension and the potential and gave the reader the impression that the story evolved as Kalquist wrote, with no clear idea of who her characters were or what the final shape of the plot would be.

And then there was the climax, the moment I was waiting for, when our hero gets stuck on an unforgiving planet with her torturer as her only ally... and they just walk a little to another group of people, where she shows how good she is at fixing things. So much potential down the drain. Bottom line: the author comes off as a beginner in writing, but at least she is not pretending her work is the greatest and/or getting her friends to post positive reviews. Even in this short story I could see the wheels turning smoother and smoother as I went along, which probably means her writing will improve. Unfortunately, as a standalone work, Better World is nothing more, nothing less than space pulp fiction, with no real impact behind the characters or the storyline.

Thursday 11 February 2016

The Ann Leckie Ancillary series... it sucks!

I really missed reading a good science fiction book and, when I was in this vulnerable mental state, I stumbled upon a very positive review on Ars Technica recommending Ann Leckie's Ancillary trilogy. Ars Technica is one of the sites that I respect a lot for the quality of its news, but I have to tell you that after this, my opinion of them plummeted. Let me tell you why.

The only remotely interesting characteristics of the Ancillary series are the premise - that an AI gets trapped in the body of an "ancillary" soldier that was used as merely one physical extension among many others - and the original idea of using no gender when talking about people. You see, in the future, the empire's language has lost its need to define people by gender and therefore they are all she, mother, sister, etc. It is important that the gender this translates to in our backward English is all female, so as to balance the patriarchal bias of our society. Way to go, Ann! The books also won a ton of awards, which made me doubt myself for a full second before deciding that notorious book awards seem to be just as narrow in focus as notorious movie awards.

Unfortunately, after two books that only exposed antiquated ideas from the space operas of the past, boring scenes and the personal biases of the author, I decided to stop. I will not read the third book, the one that maybe would have given me some closure as the last in the series. That should tell you how bad I think the books are. On a positive note, they vaguely reminded me of Feintuch's Seafort Saga. If you want to read a similarly out of date space opera, but a really good and captivating one, read that instead.

You see, it all happens on ships and stations, where the only thing that doesn't feel taken from feudal stories is... wait... err... no, there is nothing remotely modern about the books. The premise gets lost on an author who focuses exclusively on the emotions of the Artificial Intelligence, rather than on its abilities or actual characteristics. If I were an AI, I would consider that discrimination. The same ideas could be put in some magical kingdom where magical people clone themselves and keep in touch. I don't know who invented this idea that the future will somehow revert to us being pompous boring nobles that care about family name, clothes, tea and saucer sets (this is from the books, I am not making it up), but enough with it! We have the Internet. And cell phones. That future will not happen! And if it did, no one would care! The main character acts like a motherly person for stupid or young people, no doubt reflecting Leckie's mood as a stay-at-home mom at the time of writing the books. You can basically kill people with impunity in this world of hers, if you are high enough on the social ladder, but swearing is frowned upon, for example.

OK, ranted enough about this. I don't care that her vision of the future sucks. I wouldn't have cared if her writing was bad - which it isn't. It's not great either, though. I do care when I get flooded with review titles like "The book series that brought space opera into the 21st century", by Annalee Newitz, or "Ancillary Justice is the mind-blowing space opera you've been needing", by Annalee Newitz, or "Why I’m Voting for Ann Leckie’s Ancillary Justice", by Justin Landon - a friend of Annalee Newitz' from the Speculative Fiction compilations, and "A mind-bending, award-winning science fiction trilogy that expertly investigates the way we live now.", by Tammy Oler, who is writing with Annalee Newitz at Bitch Media. Do you see a pattern here?

I have to admit that I think it is a feminism thing. So enamored were these girls with a story that doesn't define its characters by gender that they loved the book. Annalee's reviews, though, describe the wonderful universe that Leckie created, with its focus on social change and social commentary, and how it makes one think of how complex things are. I didn't get that at all. I got the typical all-powerful emperor over the space empire thing, with stuck up officers believing they know everything and that everything is owed to them, and the "man/woman of the people" main character that shows them their wicked ways. The rest is horribly boring, not because of the lack of action, but because of the lack of consequence. I kind of think it's either a friend advertising for another or some shared feminist agenda thing.

Bottom line: regardless of my anger with out of proportion reviews, I still believe these books are not worth reading. The first one is acceptable, while the second one just fizzles. I am sure I can find something better to do with my time than to read the third. The great premise of the series is completely wasted with this author and the main character doesn't seem to be or do anything of consequence, while moving from "captain of the ship" to "social rebel" and "crime investigator" at random.

Wednesday 10 February 2016

I presented an introduction on Reactive Extensions at ADCES

On the 9th of February I basically held the same talk I did at Impact Hub, only I did better, and this time presented it to the ADCES group. Unbeknownst to me, my colleague there, Andrei Rînea, also held a similar presentation for the same organization more than two years ago, and it is quite difficult to claim that I was not inspired by it once one notices how similar they really were :) Anyway, that means there is no way people can say they didn't get it now! Here is his blog entry about that presentation: Bing it on, Reactive Extensions! – story, code and slides

The code, as well as a RevealJS slideshow that I didn't use the first time, can be found at Github. I also added a Javascript implementation of the same concept, using a Wikipedia service instead - since DictService doesn't support JSON.

Wagakki Band - Akatsuki no Ito

Japanese culture is certainly special. The music, the drawing style, the writing, the cinematography, they are all easily recognizable and usually of high quality. Yet I think it is even cooler when artists are able to blend Japanese feeling with Western cultural artifacts. Check out this Japanese traditional sound... made metal: Akatsuki no Ito (The Thread of Dawn?)

Friday 5 February 2016

Modifying collections from different threads and still binding it via WPF

This post discusses the solution to the NotSupportedException "This type of CollectionView does not support changes to its SourceCollection from a thread different from the Dispatcher thread" and also to the InvalidOperationException "An ItemsControl is inconsistent with its items source".

In the first case, you want to bind a collection property from your viewmodel in Windows Presentation Foundation and it says no. What happens is that you are using a BindingList<T> or an ObservableCollection<T> and the binding system wraps it, in the background, in a CollectionView that does not support changes from multiple threads. The solution to this is rather simple: use this piece of code:
BindingOperations.EnableCollectionSynchronization(collection, lockObject);
This short blog post from Florent Pellet explains things a little, but it ends on a dire note: "The ViewModel becomes dependent on the view". It also suggests that you need to create a lock object for each UI thread, if you have more than one.

This works in .NET 4.5 and that is the reason that, when you look the exception up, you get all kinds of answers that either suggest you invoke any changes on the Dispatcher UI thread (which I believe is against the idea of having a viewmodel) or weird bastardizations of the collection classes used, like trying to invoke the events for list changes on the dispatcher of the invoking delegate. I've tried that and I got the second exception, the InvalidOperationException, which I will be covering later on :)

But let's go further and examine what is going on. If you look at the method declaration, EnableCollectionSynchronization also allows specifying a synchronization callback, something that you could use to manage weird custom collection classes. The Remarks section says "When you call this overload of the EnableCollectionSynchronization(IEnumerable, Object) method, the system locks the collection when you access it", which implies you are losing some performance, but not much else. In case you have many parallel threads that are modifying your collection, you need to lock it anyway. You may, of course, create your own high performance system for changing a collection and, maybe, run a separate method to marshal changes from your private data structure to the UI-bound one.
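As a rough illustration of that overload (the reader/writer lock strategy below is just something I made up for the example, not anything the documentation prescribes), a custom synchronization callback could look like this:

using System;
using System.Collections;
using System.Collections.ObjectModel;
using System.Threading;
using System.Windows.Data;

public static class CollectionSync
{
   // Hypothetical helper: register a collection with a reader/writer lock instead of a plain lock object.
   // Your own code that modifies the collection must take the same lock, of course.
   public static ReaderWriterLockSlim Enable<T>(ObservableCollection<T> collection)
   {
      var rwLock = new ReaderWriterLockSlim();
      BindingOperations.EnableCollectionSynchronization(collection, null,
         (IEnumerable coll, object context, Action accessMethod, bool writeAccess) =>
         {
            if (writeAccess) rwLock.EnterWriteLock(); else rwLock.EnterReadLock();
            try { accessMethod(); } // the framework's read or write against the collection
            finally { if (writeAccess) rwLock.ExitWriteLock(); else rwLock.ExitReadLock(); }
         });
      return rwLock;
   }
}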

Now, the InvalidOperationException "An ItemsControl is inconsistent with its items source" is thrown when the ItemsSource property has become out of sync with the Items property, which is usually generated by the ItemsControl. So when I tried to create my own badass collection class, I managed to avoid the first exception and I got this one. The same solution applies to both cases:
BindingOperations.EnableCollectionSynchronization(collection, lockObject);
Funny enough, you have to run this piece of code on the Dispatcher UI thread.


But where to use it? It would be rather simple to use it in the viewmodel constructor, using the ((ICollection)collection).SyncRoot object, or even in the constructor of a class that inherits from either BindingList or ObservableCollection and does nothing but this type of initialization. I believe that, since this is a binding issue, something within WPF, the binding system should handle it, like some type of synchronizing Binding. For a second I thought that the IsAsync property of the Binding class would solve this by itself, but it doesn't. Also, Binding doesn't have any methods to override and BindingBase is an abstract class with internal methods to implement, which of course doesn't work. Otherwise it would have been OK, I believe, to create a special SynchronizedCollectionBinding class that enables collection synchronization at bind time. BTW, if you are thinking of implementing everything starting with MarkupExtension, forget it. The Binding class is a bit hardcoded in Visual Studio and it wouldn't actually work as expected.
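For completeness, here is a minimal sketch of the derived collection idea mentioned above; the class name is my own and, per the remark a few paragraphs up, it assumes the collection is constructed on the dispatcher (UI) thread:

using System.Collections;
using System.Collections.ObjectModel;
using System.Windows.Data;

// Hypothetical collection that registers itself for cross-thread binding as soon as it is created.
// It must be instantiated on the dispatcher (UI) thread for the registration to work as described above.
public class SynchronizedObservableCollection<T> : ObservableCollection<T>
{
   public SynchronizedObservableCollection()
   {
      BindingOperations.EnableCollectionSynchronization(this, ((ICollection)this).SyncRoot);
   }
}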

That's it, folks!

Wednesday 3 February 2016

I presented an introduction on Reactive Extensions at Impact Hub

Today I was the third presenter at the ReactiveX in Action event, held at Impact Hub, Bucharest. The presentation did not go as well as planned, but it was relatively OK. I have to say that probably, after a while, giving talks to so many people turns from terrifying to exciting and then to addictive. Also, you really learn things better when you are preparing to teach them later, rather than just perusing them.

I will be holding the exact same presentation, hopefully with a better performance, on the 9th of February, at ADCES.

For those interested in what I did, it was a code-only demo of a dictionary lookup WPF application written in .NET C#. In the final code that you can download from Github, there are three projects that do the exact same thing:
  1. The first project is a "classic" program that follows the requirements.
  2. The second is a Reactive Extensions implementation.
  3. The third is a Reactive Extensions implementation written in the MVVM style.

The application has a text field and a listbox. When the text of the field changes, a web service is called to return a list of all the words starting with the typed text and list them in the listbox, on the UI thread. It has to catch exceptions, throttle the input (so that you can write text and only access the web service when you stop typing), implement a timeout if the call takes too long, make sure that no two subsequent calls are made with the same text argument, and retry the network call three times if it fails with any of the uncaught exceptions. There is a "debug" listbox as well as a button that should also result in a web service query.
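To give an idea of how those requirements map to Rx operators, here is a rough sketch, not the actual demo code; SearchBox, WordsList and GetWordsAsync are placeholder names I am using for the text field, the listbox and the dictionary web service call:

using System;
using System.Reactive.Linq;
using System.Threading.Tasks;

// somewhere in the window or viewmodel setup;
// SearchBox, WordsList and GetWordsAsync are placeholders (see above)
var query = Observable
   .FromEventPattern(SearchBox, "TextChanged")    // every keystroke
   .Select(_ => SearchBox.Text)
   .Throttle(TimeSpan.FromMilliseconds(500))      // only react when the user stops typing
   .DistinctUntilChanged()                        // no two subsequent calls with the same text
   .Select(text => Observable
      .FromAsync(() => GetWordsAsync(text))       // the web service call
      .Timeout(TimeSpan.FromSeconds(5))           // give up if it takes too long
      .Retry(3)                                   // retry the network call three times
      .Catch(Observable.Return(new string[0])))   // turn remaining errors into an empty result
   .Switch()                                      // drop results of calls that were superseded
   .ObserveOnDispatcher();                        // marshal the results back to the UI thread

query.Subscribe(words => WordsList.ItemsSource = words);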

Unfortunately, the code that you are downloading is the final version, not the simple one that I write live during the presentation. In effect, that means you don't get to see the massive size reduction and simplification of the code, because of all the extra debugging code. Join me at the ADCES presentation (and together we can rule the galaxy) for the full demo.

Also, I intend to add something to the demo if I have the time and that is unit testing, showing the power of the scheduler paradigm in Reactive Extensions. Wish me luck!

Monday 1 February 2016

I was at FOSDEM'16

Long story short: I thought FOSDEM 2016 was terribly non-technical.

The entire conference took place at the ULB Solbosch Campus in Brussels, Belgium, which is composed of several buildings in which many rooms were being used for presentations. That meant that not only did you have to plan the talks you wanted to attend, but you also had to consider the time it took to move from one building to another (in the cold and rain). Add to this the fact that the space was still insufficient for most talks, and if you didn't get there before the talk started, it wasn't uncommon to find the room full and be turned away at the door for security reasons (meaning fire hazards and the like, not stupid terrorism). I thought the mobile app FOSDEM Companion was very helpful in keeping track of what was where and when.

The talks themselves, though, were mostly 20-25 minutes long. While some reached 45 minutes, most of them were short presentations of one product or another. Someone would speak in front of Powerpoint (or some alternative) slides and the most common template was: "I am X, I work at Y and we are doing product Z. Here is a history of the product, here is what it can do for you and you can find more at these links." They were open source and free, alright, but other than that it felt like a marketing conference, not a technical one. I have seen only one presentation that included actual code.

This doesn't mean I didn't enjoy myself. I've met old friends and some of the presentations were really interesting. I was particularly impressed by something called Ring, which is a completely peer to peer and securely encrypted communication system. Basically it allows you to find people, talk to them (via text, sound, video), while having no central server. It was something that I was looking for and that uses DHT as a discovery mechanism.

So my conclusion is that if you are not there for a specific project or topic, so that you end up finding the people that are interested in the same thing and network with them, FOSDEM is pretty superficial. The talks were recorded and the videos will slowly appear on the FOSDEM video archive site, so actually going there just to see the presentations alone might not be necessary. Being from a slightly different technical domain, I wasn't interested in socialization, and I think that was my biggest mistake.

The people there looked interesting. A friend of mine summarized it well: "one of the few places where there is a queue at the men's bathrooms and not at the women's". There were of course plenty of facially haired, pony-tailed, black leather wearing, Linux laptop carrying hackers running around, but most of the people there didn't look that young or that "hacky". In fact, I think the age average was probably around 40.

That's about it for my FOSDEM report. If you need any more information, leave me a comment and I will fill any holes in the description.

Update:
The talks that I went to and liked were these:

The Last Policeman, by Ben H. Winters

This book was recommended by Jeff Atwood, of Coding Horror and Stack Overflow fame. He liked it; I wasn't impressed. In The Last Policeman, Ben H. Winters attempts to describe a world that is powerlessly waiting for the arrival and impact of a 7km wide asteroid. While smaller than the one that killed off the dinosaurs, it would still pretty much end human civilization and most of the life on Earth. As a result, people are killing themselves in depression, quitting jobs to "go bucket list", nothing is working, nothing gets repaired, etc. In all of this, a detective is trying to solve a case that looks like just another suicide, but he feels it's not. It is an interesting concept and it was well written by Winters, but I had difficulty believing in the world he was describing. More than that, except for rumors of something Iran is planning, there is no mention of any other country. I believe that in this situation a lot of people would inertially and routinely continue what they were doing until they figured out that it doesn't make sense (and that probably it didn't make sense to begin with), but there would certainly be more aggressive social changes, which the author completely ignores. Plus, "going bucket list" would certainly become frustrating once you can't go on a cruise because no one is sailing the thing or you can't enjoy your favorite food because the restaurant got closed.

Worse than that, at the end of the book I was pretty satisfied with it. It was short, to the point and, while not perfect, it was enjoyable. And then I learned that it is part of a trilogy. This somehow diminishes the act of reading it, and the book as a whole, because I can't convince myself to read the other two books. So yeah, bottom line: I thought it was kind of average.