[Last update: 2009-10-18T19:58:33Z] [5 articles in total]

Review of Melanie Mitchell's excellent book

So today is Ada Lovelace Day, and the challenge has been thrown down to write about women in technology. Now there are many obvious candidates (Grace Hopper for one), but I decided to write instead a review of an excellent computer science book that just happens to have a female author. I originally bought this book from a second-hand bookshop for five pounds. I liked it so much that when I lost my original copy I bought it again at full price.

I have been interested in genetic algorithms (GAs) for years and have read fairly widely: Koza, Holland and so on. Many of those books adopt either a sort of messianic tone, implying that this one technique will solve all the problems of the world, or the typical academic approach of making the work sound as difficult and important as possible while simultaneously undermining all the work of the author's peers in the field.

Mitchell's book could not be more different. It explains the basic approaches clearly, and in enough detail to inspire further study. She has an excellent grasp of the pros and cons of the approach and its promise, and the book is mercifully short. An excellent read, and real kudos to Mitchell for an approachable and readable introduction to the field.

I thought I would never find a keyboard that I liked better than Typematrix, but I may have to eat my words...

Until yesterday, I used a Typematrix Dvorak keyboard. The Typematrix is a fabulous keyboard with lots of terrific design features, but its primary strengths are its small desktop footprint and its matrix layout, in which the fingers move straight forward and back without the small sideways movements forced by the staggered layouts of conventional keyboards. This is a massive improvement if you do a lot of typing or if you have problems with your hands and wrists.

The one weakness of the Typematrix, in my opinion, is that the modifier keys (Ctrl and Alt) are not placed in a very convenient spot. This used not to bother me: as a Vim user I didn't need many control keystrokes. Recently, however, I have been doing almost all of my programming in a proprietary in-house environment, and my little fingers have been taking more and more of the strain. I finally decided to make the switch after a marathon coding day in which I wrote something like 600 lines of very formulaic code and my fingers ached from the contortions required by all the control keystrokes.

I briefly tried out one of my colleague's Kinesis "Advantage" keyboards and immediately decided it was weird enough that I should buy one. I opted for a model with a built-in hardware Dvorak mode, which I prefer because it lets me use Dvorak right from the login screen rather than switching it on in user preferences or dotfiles.

So, the Kinesis keyboard feels like a substantial unit. When you are typing on it, it has a pleasing clackiness, and the general functionality is good. It also has a very cool hardware remapping feature that lets you move the keys you don't like around to your heart's content. It's a really nice keyboard.

A review of "The Crying of Lot 49" by Thomas Pynchon

About a year ago, when I had just read and enjoyed Stone Junction: An Alchemical Potboiler by Jim Dodge, a friend who has very similar taste in novels to mine recommended that I read some Thomas Pynchon. This surprised me for two reasons. Firstly, I had (I am now ashamed to admit) always assumed without evidence that people who read Pynchon (and by extension, the man himself) were hopeless pseuds and that the experience of reading his work would thus be a disappointing one. Secondly, the only thing I had really heard about Pynchon was the supposedly fiendish complexity of his style, and hearing that his books might be similar to the easy, fun narrative riffs of Jim Dodge seemed peculiar. In actual practice, I would say on reflection that "The Crying of Lot 49" and "Vineland" are more like Jim Dodge, while "V" and "Gravity's Rainbow" are altogether darker and more troublesome. "Slow Learner" and "Mason and Dixon" form the final category, namely "Pynchon I haven't read yet".

So I embarked (with my trepidation tempered somewhat by the shortness of the novel) on "The Crying of Lot 49", and found (of course) that I was wrong and that Pynchon is exactly my sort of author. Funny, thoughtful and compassionate, he is capable of a staggering range and has an incredible dexterity with language and plot. His books are rich in allusion, but wear their erudition lightly, so where I was able to spot and decipher additional subtexts, they added something, but I never felt as if I was being presented merely with an exposition of the author's cleverness and research. If that makes me a pseud, then I am proud to be so.

I would strongly recommend that anyone who is the least bit curious about Pynchon start reading with "The Crying of Lot 49". It's tremendous fun to read and you'll never feel quite the same about waste bins again.

A review of "The Undercover Economist" by Tim Harford

Inspired by my friend Erich Schlaikjer, I intend to write some reviews of books and music and put them up here, so I'll start with the book I have just finished, which is "The Undercover Economist" by Tim Harford.

This book is an entertaining and thought-provoking exposition of some key principles of economics and how they interact with big questions (such as "Why are rich countries rich and poor countries poor?"), small questions (such as "Why does my morning coffee cost as much as it does?") and all sizes in between. In that respect, it grows out of the author's "Dear Economist" column in the weekend FT, in which he offers often amusing "agony-aunt" style advice, using the principles of economics to shine a light on everyday activities and moral dilemmas. The same mix of clear, reasoned argument and amusing, thought-provoking style is in evidence in this book.

All of the economics is explained and argued in a way that someone (such as myself) with an interest in, but little understanding of, the field can easily follow, and there are references at the back to academic papers and journalism that back up the author's points and provide a starting point for further reading.

As another entry in the growing field of "popular economics" books, with its ostensibly similar promise of using economics to shine a new light on everyday things, this book is bound to provoke comparisons with "Freakonomics", but both books are worth reading in their own right and there is very little overlap between them. Actually, I enjoyed "The Undercover Economist" more, not least because it lacks the self-congratulatory sections of Freakonomics that annoyed me so. (That book has sections that are just one of the authors telling you how clever and amazing the other author is; I don't want someone to tell me they are clever, I want them to show me!) It is also more deeply thought-provoking for being less willfully controversial.

Taking advice from a real expert, the pricer becomes more idiomatic

In my quest for a good OCaml book, I found and bought "OCaml for Scientists" by Jon Harrop. I really can't recommend it too highly: it's an excellent book. Furthermore, Jon Harrop looked at this website and gave me some advice about making the code more idiomatic by using record types. He even wrote the first one for me to show how!

The basic idea is that you don't need classes to encapsulate data (as I had been using them); you can instead use let-bindings that are captured by closures stored in a record of functions. This is very nice indeed.

```
(** Parameter for models *)
type param = {
  integral : float -> float -> float;
  integral_sq : float -> float -> float;
  mean : float -> float -> float;
  rms : float -> float -> float;
}

(** Constructor function that makes a parameter record given an
    integral and an integral_sq func. The rms is the square root
    of the mean of the square. *)
let make_param integral integral_sq =
  {
    integral = integral;
    integral_sq = integral_sq;
    mean = (fun t1 t2 -> (integral t1 t2) /. (t2 -. t1));
    rms = (fun t1 t2 -> sqrt ((integral_sq t1 t2) /. (t2 -. t1)));
  };;
(** Build a constant parameter *)
let param_const x =
  let integral t1 t2 = (t2 -. t1) *. x in
  let x_sq = x *. x in
  let integral_sq t1 t2 = (t2 -. t1) *. x_sq in
  make_param integral integral_sq;;

(** Parameter that models a linear function f(x) -> ax + b.
    The integral of (ax + b)^2 is (a^2/3)x^3 + abx^2 + b^2 x. *)
let param_linear a b =
  let integral t1 t2 =
    let integrate_to x = 0.5 *. x *. a *. x +. x *. b in
    integrate_to t2 -. integrate_to t1
  in
  let integral_sq t1 t2 =
    a ** 2. /. 3. *. (t2 ** 3. -. t1 ** 3.) +.
    a *. b *. (t2 ** 2. -. t1 ** 2.) +.
    b ** 2. *. (t2 -. t1)
  in
  make_param integral integral_sq;;
```
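As a quick sanity check, here is a hypothetical usage sketch (my addition, not from Jon Harrop's suggestion); it repeats the definitions in condensed form so the snippet compiles on its own:

```ocaml
(* Hypothetical usage sketch: the param record and the constant
   constructor are repeated here so this snippet runs standalone. *)
type param = {
  integral : float -> float -> float;
  integral_sq : float -> float -> float;
  mean : float -> float -> float;
  rms : float -> float -> float;
}

let make_param integral integral_sq = {
  integral = integral;
  integral_sq = integral_sq;
  mean = (fun t1 t2 -> integral t1 t2 /. (t2 -. t1));
  rms = (fun t1 t2 -> sqrt (integral_sq t1 t2 /. (t2 -. t1)));
}

let param_const x =
  let integral t1 t2 = (t2 -. t1) *. x in
  let integral_sq t1 t2 = (t2 -. t1) *. x *. x in
  make_param integral integral_sq

let () =
  let p = param_const 3. in
  (* a constant 3 integrated over [0, 2] gives 6; its mean is 3 *)
  Printf.printf "integral = %g, mean = %g\n" (p.integral 0. 2.) (p.mean 0. 2.)
```

The nice thing is that callers just invoke fields of the record (`p.integral 0. 2.`) with no class machinery in sight.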

So our classes have become a simple record of four functions, plus some constructor functions that return this type. Notice how make_param takes two functions and derives two more from them. The code is otherwise similar to the class-based examples, but a little more straightforward. For piecewise parameters, we do the same sort of thing:

```
(** For piecewise constants and piecewise linear functions a "piece" is a
    parameter that applies between low and high limits. *)
type piece = { low : float; high : float; p : param };;

(** Make piecewise parameters. Takes a list of pieces, each of which
    specifies the limits and the linear or const parameter in its region.
    At present, does no checking that the pieces are contiguous and
    non-overlapping. *)
let param_piecewise pieces =
  (* There's probably a better way of doing this, but anyway...
     Apply a function to each piece, with the arguments being the part
     of the interval from t1 to t2 that lies inside the piece's interval. *)
  let visit_pieces pieces fn t1 t2 =
    let visit_piece piece low high =
      if low > t2 || high < t1 then
        0.
      else
        let range_start = max t1 low in
        let range_end = min t2 high in
        (fn piece) range_start range_end
    in
    let rec visit_list so_far lis =
      match lis with
      | [] -> so_far (* we're done *)
      | { low = low; high = high; p = p } :: rest ->
          visit_list (so_far +. visit_piece p low high) rest
    in
    visit_list 0. pieces
  in
  let integral t1 t2 = visit_pieces pieces (fun x -> x.integral) t1 t2 in
  let integral_sq t1 t2 = visit_pieces pieces (fun x -> x.integral_sq) t1 t2 in
  make_param integral integral_sq;;

(** Helper func to make part of a piecewise const function *)
let make_const_piece low high x =
  assert (low < high);
  { low = low; high = high; p = param_const x };;

(** Helper func to make part of a piecewise linear function *)
let make_linear_piece low high a b =
  assert (low < high);
  { low = low; high = high; p = param_linear a b };;
```

This really is a great deal nicer than what we had before. Flushed with success, we make a simple function to turn a list of (x, y) tuples into a piecewise constant function:

```
let make_const_param_curve lis =
  let rec build_curve (so_far : piece list) x_old lis =
    match lis with
    | [] -> List.rev so_far
    | (x, y) :: rest when x > x_old ->
        let this_piece = make_const_piece x_old x y in
        build_curve (this_piece :: so_far) x rest
    | _ -> invalid_arg "make_const_param_curve: expects sorted x in list"
  in
  build_curve [] 0. lis;;
```

Now we can build a piecewise const function by doing:

```
make_const_param_curve [ 0.1, 0.5; 0.5, 0.25; 1., 0.2 ];;
```

It would be a simple matter to make a function that built up a piecewise linear function in the same manner.
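Here is one hedged sketch of how that might look; the make_linear_param_curve function is my own invention, not from the post. It interpolates linearly between consecutive (x, y) points, producing one linear piece per interval, and the supporting definitions from earlier are repeated in condensed form so the snippet compiles standalone:

```ocaml
(* Hypothetical sketch: build a piecewise linear parameter from a
   sorted list of (x, y) points by interpolating between consecutive
   points. Earlier definitions are repeated so this runs on its own. *)
type param = {
  integral : float -> float -> float;
  integral_sq : float -> float -> float;
  mean : float -> float -> float;
  rms : float -> float -> float;
}

let make_param integral integral_sq = {
  integral = integral;
  integral_sq = integral_sq;
  mean = (fun t1 t2 -> integral t1 t2 /. (t2 -. t1));
  rms = (fun t1 t2 -> sqrt (integral_sq t1 t2 /. (t2 -. t1)));
}

let param_linear a b =
  let integral t1 t2 =
    let integrate_to x = 0.5 *. x *. a *. x +. x *. b in
    integrate_to t2 -. integrate_to t1
  in
  let integral_sq t1 t2 =
    a ** 2. /. 3. *. (t2 ** 3. -. t1 ** 3.) +.
    a *. b *. (t2 ** 2. -. t1 ** 2.) +.
    b ** 2. *. (t2 -. t1)
  in
  make_param integral integral_sq

type piece = { low : float; high : float; p : param }

let make_linear_piece low high a b =
  assert (low < high);
  { low = low; high = high; p = param_linear a b }

(* Each interval [x0, x1] gets the straight line through the two
   points (x0, y0) and (x1, y1). *)
let make_linear_param_curve lis =
  let rec build so_far (x0, y0) lis =
    match lis with
    | [] -> List.rev so_far
    | (x1, y1) :: rest when x1 > x0 ->
        let a = (y1 -. y0) /. (x1 -. x0) in
        let b = y0 -. a *. x0 in
        build (make_linear_piece x0 x1 a b :: so_far) (x1, y1) rest
    | _ -> invalid_arg "make_linear_param_curve: expects sorted x in list"
  in
  match lis with
  | [] -> []
  | first :: rest -> build [] first rest
```

For example, make_linear_param_curve [0., 0.; 1., 2.] yields a single piece covering [0, 1] whose parameter is the line f(x) = 2x.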

Unless otherwise specified the contents of this page are copyright © 2015 Sean Hunter. This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.