Archive

Posts Tagged ‘brmlab’

Brmson / BlanQA

January 27th, 2014

I have recently been dabbling in Natural Language Processing, in particular Question Answering. I have been fascinated by the success of IBM Watson and have gradually come to believe that this technology can serve as a great basis for autonomous agents operating in the complex world of human knowledge. (I later came across Project Aristo – I’m not alone.) This approach, compared to projects like OpenCog that aim to create autonomous agents understanding and operating in the physical world, seems to offer many advantages – but let’s talk about that some other time.

Let’s say we wanted to take a stab at approximating IBM Watson with easily available technology, in “at home” conditions (or rather, “at hackerspace” – I gave this aim the temporary callsign “Project Brmson”). What’s the best we can do?

So I took a look at the current open source question-answering technologies and found – well, essentially just one, and nothing that would be immediately usable by anyone. I have put together a short survey of the current landscape.

The only OSS framework I found that (i) could be used without too many modifications to produce something functional, and (ii) would be a good base to build a truly good system on, is OAQA / OpenQA. It seems appealing from multiple viewpoints – it builds on the UIMA unstructured data processing platform which is also at the basis of IBM Watson, and it originates at CMU, which collaborated with IBM in this area; and, well, it’s the only platform that already exists anyway, so it’s a good starting point for someone who has no prior clue about the field. An honorable mention goes to OpenEphyra, basically a non-UIMA OAQA predecessor by the same institution; it’s not a good base for new systems, but it can be mined for a lot of NLP functionality.

In my first stab, I looked at whether there is actually a working QA system built on top of OAQA, and the answer was non-obvious. There is a helloqa project, but its master branch can currently do nothing useful. However, there is also a prototype branch that can actually answer some terrorism-related questions! It doesn’t work out of the box, but our fork does if you follow the instructions. But overall, the project seems to be a bit of a hack and not a good base for a universal system usable by anyone but the original author.


So I set out to rewrite the helloqa-prototype from scratch on top of OAQA and build a different, clean and extendable QA pipeline (that shares bits of the original code and is much simpler). Thus, behold the project BlanQA! :-)

BlanQA is focused on universality, practicality and user-friendliness. That means there is relatively detailed documentation and easy-to-follow installation instructions (try BlanQA out yourself!). By default, BlanQA offers an interactive mode and answers on top of the Project Gutenberg corpus; but you can also connect it to IRC (#brmson @ freenode) or run it on top of Wikipedia.

BlanQA is still a very stupid program at this point. It gets the answer right about 10-30% of the time, depending on how nicely you ask. But it’s more important as a base on top of which you can add clever algorithms (the smartest parts of BlanQA are currently outsourced from the OpenEphyra project, mainly guessing the type of the answer – is it a person? location? amount of something?). And if you want an OSS question-answering engine now, BlanQA is where to turn!


I want to develop this further, but the way ahead remains a little unclear. The thing is, OAQA appears to have significant architectural problems, as I realized while I continued hacking on BlanQA and learning more about both OAQA and the UIMA framework it builds on. The rest of this section is a bit technical; cf. also a quick intro to the BlanQA architecture.

The basic UIMA principle is that each artifact (in this case: question, document/passage, answer) should have its own CAS (“piece of data” with a set of annotations and other featuresets derived from it) with a dedicated type system and appropriate Sofa (view of this piece of data). This would enable easy creation of stand-off annotations of e.g. fetched documents.

However, the OAQA model works with just a single CAS that has just the question text set as its Sofa and then a variety of types mashed together, partitioned only into phase-based views. This seems to me a substantially less appealing option – it doesn’t allow using third-party UIMA annotators that expect their subject to be the Sofa, it might be harmful for scaleout, and it seems generally awkward to use; I actually have a hard time seeing what advantages using UIMA brings to the table in this model.

So it seems the way forward for BlanQA (or likely a differently-named successor) is to break away from OAQA and build directly on top of UIMA (possibly with a hacked version of uima-ecd that supports multiple CASes, but that seems a bit of an intimidating proposition).


Tue Jan 28 2014 update: Note that we have started work on a new Question Answering engine YodaQA built on UIMA from scratch.

Categories: software

Texas Instrument Launchpad MSP430 and Linux II

June 13th, 2012

So, thanks to the very helpful Rickta59 on the #43oh IRC channel, I got serial communication on my Launchpad v1.5 working. The key piece of information I was missing:

If you are using hardware UART,
you must rotate the RX-TX jumpers by 90 degrees!

This is even drawn on the board, but it just didn’t occur to me that I needed to do this simple thing. Most examples seem to use the hardware UART, and the Energia Serial class also uses the hardware UART.

It is still very flaky:

  • For the first ten seconds, communication is impossible. Wait for the timeout messages to appear in dmesg; then you can start communicating.
  • When the board is sending data, something must be reading it on the host side. If nothing is, the driver collapses and you need to replug the device (a minimal reader sketch follows below this list).
  • The latter might be circumvented by communicating over USB directly, without involving the tty driver.
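
Such a drainer takes only a few lines of Perl – a minimal sketch, assuming the board shows up as /dev/ttyACM0 and talks at 9600 baud (adjust to taste):

use strict; use warnings;

# Put the tty into raw mode first, then just keep draining it.
system('stty', '-F', '/dev/ttyACM0', 'raw', '9600') == 0 or die "stty failed";
open my $tty, '<', '/dev/ttyACM0' or die "open: $!";
while (sysread $tty, my $buf, 64) {
	print $buf;   # or just discard the data to keep the driver happy
}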

So, it is rather fragile, but usable! Let’s enjoy our Launchpads for projects where this is not a big issue…

Texas Instrument Launchpad MSP430 and Linux

June 11th, 2012

I found out that the situation with the MSP430 is not as bad as it seemed. This post is mostly obsolete, but I’m leaving the text up for the benefit of the Google index and other desperate people struggling with their Launchpad. :-)

This blogpost serves as a big fat warning to those who might be about to follow in my footsteps:

Currently sold TI Launchpad MSP430
is not properly supported by Linux
as of 2012-06-01

It’s a sad reality, but that’s just how it is, to the best of my knowledge, and after a lot of research and doing unbelievable things to kernel drivers etc. To clarify a bit: basic programming using mspdebug works, but you cannot communicate between the host and the board using USB serial. This seems to have worked with much older USB chips, but not with the ones TI uses in current versions of the board (I got a Launchpad with MSP-EXP430G2, ordered in May 2012).


Some fun technical details to help the Google index and to guide others diagnosing this:

[186808.775510] usb 1-1.2: new full-speed USB device number 7 using ehci_hcd
[186808.891778] usb 1-1.2: New USB device found, idVendor=0451, idProduct=f432
[186808.891788] usb 1-1.2: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[186808.891794] usb 1-1.2: Product: Texas Instruments MSP-FET430UIF
[186808.891800] usb 1-1.2: Manufacturer: Texas Instruments
[186808.891804] usb 1-1.2: SerialNumber: CFFF4695F6C11445
[186808.924900] cdc_acm 1-1.2:1.0: This device cannot do calls on its own. It is not a modem.
[186808.924914] cdc_acm 1-1.2:1.0: No union descriptor, testing for castrated device
[186808.925029] cdc_acm 1-1.2:1.0: ttyACM0: USB ACM device
[186808.927595] usbcore: registered new interface driver cdc_acm
[186808.927603] cdc_acm: USB Abstract Control Model driver for USB modems and ISDN adapters
[186818.963279] generic-usb 0003:0451:F432.0001: usb_submit_urb(ctrl) failed
[186818.963332] generic-usb 0003:0451:F432.0001: timeout initializing reports
[186818.964177] generic-usb 0003:0451:F432.0001: hiddev0,hidraw0: USB HID v1.01 Device [Texas Instruments Texas Instruments MSP-FET430UIF] on usb-0000:00:1a.0-1.2/input1
[186818.964262] usbcore: registered new interface driver usbhid
[186818.964269] usbhid: USB HID core driver

This is what my dmesg says the first time the board is plugged in. mspdebug works fine, but any attempt at serial communication over /dev/ttyACM0 (talking to the TI-provided sample UART code) fails. Oh, and by the way, if you are actually wondering how to compile and upload stuff on this baby:

msp430-gcc -mmcu=msp430g2553 -Wall -O3 -o uart_01_9600 msp430g2xx3_uscia0_uart_01_9600.c
mspdebug rf2500 'prog uart_01_9600'

For the USB interface, TI includes its own crazy USB-enabled microcontroller on the board, providing a HID-ish interface (for mspdebug) and an ACM-ish interface (for UART emulation) on a single port (which is nicely confusing). The serial part is supposed to be handled by the ti_usb_3410_5052 kernel driver, which grabs a firmware image and attempts to reflash the USB microcontroller so that it presents a more sensible serial USB interface (pretty crazy, eh?). However, the rf2500 variant of this chip appears to be too new and is simply not supported by either the firmware or the firmware uploader.

Tweaking the USB ids in the driver (f430 -> f432) does not help. Getting the ti_3410.fw firmware that Debian helpfully does not ship does not help. Manually binding the driver to the USB device does not help. The furthest I get is that the driver indeed tries to flash the ti_3410.fw firmware to the device, but just times out doing that (I think I may have bricked the serial part of the USB microcontroller by now):

[193053.430662] ti_usb_3410_5052 1-1.2:1.0: TI USB 3410 1 port adapter converter detected
[193054.443490] usb 1-1.2: ti_download_firmware - error downloading firmware, -110
[193054.443528] ti_usb_3410_5052: probe of 1-1.2:1.0 failed with error -5

Oh, and running mspdebug rf2500 exit before any serial communication (a tip I found somewhere) does not help either. Obviously-working UART code for the MSP430G2553 would be welcome too, to triple-rule-out a uC-side firmware problem. (The Launchpad board is awesome, but RX/TX LEDs are sorely missing. I know, I could grab an oscilloscope… but how many hours have I already wasted on this?)


So, what seemed to be a great Arduino replacement turns to dust for me, since the whole point of 80% of my Arduino projects is to talk to a computer… That said, if (or rather, once) you make it work, you will get one, or maybe even two, Launchpads for free from me.

Realtime Signal Analysis in Perl

September 24th, 2011

About a month ago, we were working on the Fluffy Ball project – a computer input device that can react to fondling and punching. Thanks to a nice idea on the brmlab mailing list, we use a microphone and process the noise coming from the ball’s scratchy stuffing and an embedded jingle. The sounds from the outside are almost entirely dampened by the stuffing, and for a human, the noises of fondling and punching are easily distinguishable.

The frequency spectrum, for our purposes, is just an array indexed by frequency, storing the amplitude of each frequency (in some range). A common variation is the power spectrum, which describes the power of each frequency, i.e. the amplitude squared. The frequency spectrum is obtained by splitting the input signal into fixed-size windows and performing a Discrete Fourier Transform on each of them.

It turns out that trivial spectrum-based rules can be used to achieve reasonably high detection accuracy for a computer, too (especially when the user is allowed to “train” her input based on feedback); I had big plans to use ANNs and all the nifty things I learned in our AI classes, but that turned out to be simply overkill. The input signal is transformed to a frequency spectrum (see box) using a real-input discrete FFT.

So, we have the audio signal coming in from a regular mic device and need to process it further. I chose Perl for quick prototyping and assumed I would find some pre-made scaffolding for this. But it turns out that no one has really published a simple example of even just showing a real-time frequency spectrum. So, here you go! :-)

First, we need some reasonable way to continuously display the spectrum. Most GUI paradigms are event-driven, but events usually represent pieces of user interaction, and while it would be possible to incorporate continuous data-based updates into this model, it feels quite backwards. So we use a trick:

use warnings;
use strict;
 
init_dsp(); init_fft();
 
use Tk;
our $mw = MainWindow->new;
$mw->after(1, \&ticks); # after 1ms, give control back
MainLoop;
 
sub ticks {
	while (1) {
		render_signal(process_signal(read_dsp()));
		$mw->idletasks();
	}
}

This circumvents the event-driven architecture of Tk and instead puts our main loop in control, processing any GUI events when it’s a good time. For more complex programs, this is a bad idea and will lead to poorly maintainable code, but when writing simple tools, you should not succumb to grand frameworks and let your code outgrow you.

Okay, how do we grab the audio input signal in Perl? Unfortunately, there are not really any handy modules you could use thoughtlessly. Audio::DSP is a possibility, but using it is clumsy, especially in the current world of ALSA, as you have to rely on the imperfect aoss wrapper. A simple alternative is to get the raw byte data through a pipeline from the ALSA arecord tool:

our ($devname, $fmt, $bitrate, $wps, $bps, $bufsize, $dsp);
BEGIN {
	$devname = "default"; # or e.g. hw:1,0 for an additional USB soundcard input
	$fmt = 16;            # sample format (bits per sample)
	$bitrate = 16384;     # sample rate (number of samples per second)
	$wps = 8;             # FFT windows per second (rate of FFT updates)
	$bps = ($fmt * $bitrate) / 8; # bytes per second
	$bufsize = $bps / $wps;   # window buffer size in bytes
}
 
sub init_dsp {
	open ($dsp, '-|', 'arecord', '-D', $devname, '-t', 'raw',
		'-r', $bitrate, '-f', 'S'.$fmt) or die "arecord: $!";
	use IO::Handle;
	$dsp->autoflush(1);
}
 
sub read_dsp {
	my $w;
	read $dsp, $w, $bufsize or die "read: $!";
	return $w;
}

read_dsp will return one signal window per call, the window being a binary blob consisting of one two-byte word per sample. We want to magically convert this to a frequency spectrum.

Audio::Analyze is again the simple way to get a signal spectrum. If you are after analyzing a pure audio signal, you probably want to use it, since it can easily filter the signal based on the relative human perception of frequencies etc. But for us it is inconvenient to feed it data through a pipe, so we will use Math::FFT directly. It will still handle all the gory math for our case (and we care about the actual noise, not the way people would hear it).

use Math::FFT;
use List::Util qw(sum);
 
our @freqs;
sub init_fft {
	my $dft_size = $bitrate / $wps;
	for (my $i = 0; $i < $dft_size / 2; $i++) {
		$freqs[$i] = $i / $dft_size * $bitrate;
	}
}
 
sub process_signal {
	my ($bytes) = @_;
 
	# Convert raw bytes to a list of numerical values.
	$fmt == 16 or die "unsupported $fmt bits per sample\n";
	my @samples;
	while (length($bytes) > 0) {
		my $sample = unpack('s<', substr($bytes, 0, 2, ''));
		push(@samples, $sample);
	}
 
	# Perform RDFT
	my $fft = Math::FFT->new(\@samples);
	my $coeff = $fft->rdft;
 
	# The output is a list of complex numbers describing the exactly
	# phased sin/cos waves. Taking the absolute value of each complex
	# number gives the amplitude of the wave at each frequency.
	my @mag;
	$mag[0] = sqrt($coeff->[0]**2);
	for (my $k = 1; $k < @$coeff / 2; $k++) {
		$mag[$k] = sqrt(($coeff->[$k * 2] ** 2)
		                + ($coeff->[$k * 2 + 1] ** 2));
	}
 
	# Rescale to 0..1. Many fancy strategies are possible, this is
	# extremely silly.
	my $avgmag = sum (@mag) / @mag;
	@mag = map { $_ / $avgmag * 0.3 } @mag;
	return @mag;
}

Not much to add besides the inline comments. The input of the process_signal function is a raw byte stream; the output is a list of amplitudes, with @freqs mapping the list indices to the actual frequencies in Hz. The normalization to the [0,1] interval shown here (pitching the mean at 0.3) is extremely naive; again, many strategies are possible. Also, in more serious applications you certainly want to apply a window function, as sketched below.
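
For illustration, a minimal sketch of the classic Hann window – the formula is textbook, the naming is mine:

use constant PI => 4 * atan2(1, 1);

# Hann window: w(i) = 0.5 * (1 - cos(2*pi*i / (N-1))). Tapering the
# edges of each window to zero reduces the spectral leakage caused by
# chopping the signal into rectangular chunks.
sub hann_window {
	my @samples = @_;
	my $n = @samples;
	return map { $samples[$_] * 0.5 * (1 - cos(2 * PI * $_ / ($n - 1))) }
	       0 .. $n - 1;
}

Then a @samples = hann_window(@samples); right before the Math::FFT->new call above is all it takes.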

Now, for the visualization. We have chosen Tk for our GUI (it looks ugly, but it is reasonably easy to use despite its Tcl antics). We will use its Canvas object where we can draw freely, and just plot a line for each frequency:

our $canvas;
sub render_signal {
	# Display parameters, tweak to taste:
	my $rows = 2;
	my $hspace = 20;
	my $height = 150;
	my $vspace = 20;
 
	my @spectrum = @_;
	my $row_freqn = @spectrum / $rows;
 
	unless ($canvas) {
		$canvas = $mw->Canvas(
			-width => $row_freqn + $hspace * 2,
			-height => $height * $rows + $vspace * ($rows + 1));
		$canvas->pack;
	}
	$canvas->delete('all');
 
	for my $y (0..($rows-1)) {
		for my $x (0..($row_freqn-1)) {
			my $hb = ($height + $vspace) * ($y + 1);
			my $i = $row_freqn * $y + $x;
 
			# Draw line:
			my $ampl = $spectrum[$i];
			$ampl <= 1.0 or $ampl = 1.0;
			my $bar = $height * $ampl;
			$canvas->createLine($x + $hspace, $hb,
			                    $x + $hspace, $hb - $bar);
 
			# Draw label:
			if (!($x % ($row_freqn/4))) {
				$canvas->createLine($x + $hspace, $hb + 0,
				                    $x + $hspace, $hb + 5,
				                    -fill => 'blue');
				$canvas->createText($x + $hspace, $hb + 15,
				                    -fill => 'blue',
				                    -font => 'small',
				                    -text => $freqs[$i]);
			}
		}
	}
 
	$mw->update();
}

This suffices for a naive visualization; you can easily tweak it to do thresholding and whatever else you desire. I have found that on some of my computers, repeatedly drawing a large number of lines pushes the X protocol to its limits, and sometimes the spectrum starts to lag behind the signal; either show wider bars averaging together multiple frequencies (see the sketch below), or use something other than a Canvas object – raw pixmap transfer would likely beat such a large number of line drawing operations.
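
The bar-widening variant is just a few lines – a sketch, reusing sum from List::Util, which we have already imported; pick the bin width to taste:

# Average every $n adjacent frequency bins into one bar, cutting the
# number of createLine calls n-fold (leftover bins are dropped).
sub bin_spectrum {
	my ($n, @mag) = @_;
	return map { sum(@mag[$_ * $n .. $_ * $n + $n - 1]) / $n }
	       0 .. int(@mag / $n) - 1;
}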

For serious signal analysis work, you will also want a spectrogram – a time-based plot of the amplitudes of the various frequencies.

To get a working script skeleton, simply piece the code snippets together (fb-simple.pl). See fb.pl for the real fluffy ball script. It is much uglier, but it maintains sample averages over longer time windows (essential for more complex signal analysis), has simple sample-recording capabilities, and includes an example of a naive threshold-based classifier.

Categories: software

brmd: A Case for POE

May 26th, 2011

In brmlab, we want to track who is unlocking the space, whether someone is inside, have some good visual indicator that the live stream is on air, and so on. In other words, we have an Arduino with some further hardware, and we want to show whatever the Arduino reports on IRC and the web, and provide some web-based control (open/closed status override) in the opposite direction too.

What to use for a service (we call it brmd) that binds all these interfaces together? It just needs a lot of boring frontends and simple state maintenance. It turns out that Perl’s POE framework is ideal for this – most of the code for IRC, HTTP and device read/write is already there, so you just grab the modules, slam them together, and you have exactly what you need with minimal effort. Right?

It turns out that there are caveats – basically, the idea is correct, and aside from getting stuck on a single stupidity of mine, I would have had the whole thing put together in something like half an hour. Unfortunately, the moment you want robustness too, things get a lot more complex; to handle the device disappearing, IRC disconnections, HTTP socket fds leaking away, etc., you suddenly need to either magically know which further modules to hook up or start exerting some manual effort. Still, I like how POE makes it so easy to give a simple state machine many input/output interfaces, and once you get used to the idiosyncrasies, you can even make it somewhat reliable.

Example POE code

While this task seems to be an ideal fit for POE, I’ve found surprisingly few examples of more complex POE component interaction on the web. Therefore, I’m going to lay out at least a tersed-up version of brmd below to help fellow future googlers. None of this is anything ground-breaking, but it should help a lot to have a template to go by. Our real version is somewhat more verbose and includes some more interfaces: brmdoor.git:brmd/brmd.pl

I assume that you already know what a “POE kernel” and a “POE session” are. Beware that I’m a POE beginner myself, and I haven’t gone through much effort to clean up the code the way I would if I were working on this with someone else. Some things surely aren’t solved optimally, and you might even pick up a bad habit or two.

In order to have some neat separation, we will divide brmd into several components, each taking care of a single interface; the main package will only spawn them and do some very basic coordination. If we were to grow much larger, it would be worth the effort to set up some kind of message bus (I wish POE provided that implicitly); here we just directly signal the components needing the info.
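
To give an idea of the shape before diving in, here is a minimal sketch of two such components signalling each other directly – not the real brmd code; the aliases and event names are made up for illustration:

use strict; use warnings;
use POE;

# An "interface" component: polls a device and posts status events.
POE::Session->create(inline_states => {
	_start => sub {
		$_[KERNEL]->alias_set('device');
		$_[KERNEL]->delay(poll => 1);
	},
	poll => sub {
		# ...read the Arduino here; we just fake a status change:
		$_[KERNEL]->post(irc => announce => 'door open');
		$_[KERNEL]->delay(poll => 1);
	},
});

# Another component consuming the events.
POE::Session->create(inline_states => {
	_start   => sub { $_[KERNEL]->alias_set('irc'); },
	announce => sub { print 'IRC would say: ', $_[ARG0], "\n"; },
});

POE::Kernel->run();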


Categories: software

Cute cuddly robot!

May 6th, 2011

In the past few months, I have been playing a bit with a great robotic platform available at the university in the Introduction to Mobile Robotics and Eurobot courses.

We were provided a pre-made robot chassis with some basic electronics, an ATMega128 board, a hefty battery, motors and Sabertooth motor drivers. With AxTheB, we built a Merkur-based construction on top of it to hold a camera-on-stick module that gives a picture of the robot’s surroundings, and a 12″ notebook that is hooked up to the webcam and the ATMega.

The most interesting thing is the camera. It is held up on a wooden stick, facing upwards to a parabolic mirror (i.e. a ladle), giving it a picture of its surroundings in about a 320° angle (part of the view is obstructed by the stick). That’s not my original idea – it was originally suggested and built for brmbot outdoor. We had it for the Robotour 2010 competition, and during the competition I even built some basic image recognition for it, but in the end we did not have time to integrate it into the main control software, so the camera stick served only as a holder for the GPS+compass back then.

More about the tasks below. It turned out that I really didn’t have enough time for this, so things got quite stressful at one point, but I would have felt really bad if I had given up. In the end, I managed to get things working. And as any robot builder will tell you, seeing your tiny friend roam around happily, doing whatever it’s supposed to do, is worth any stress! :-)

The source code is rather horrible. Keep in mind that it was hacked incrementally and never really cleaned up.

Brmpuk

The first task (Puck Collect from Robot Challenge 2011 in Vienna): Your tiny robot is in a ~2×2 m white playground, with two corner squares painted red and blue, and with tiny red and blue pucks scattered over the playground. Its robotic opponent also sits in the playground. Each player has a color assigned, and its goal is to accumulate as many pucks of its color in its corner as possible, while avoiding putting pucks of the other color there. And there is a deadline of two minutes.

(In addition, the robot also has a sort of plough in the front where it collects and pushes the pucks, as the few centimeters directly in front of the robot are invisible to the camera.)

Unfortunately, I was not able to spend a lot of time on the project (or attend the awesome lectures) due to sickness (to play with robots, you have to come to the lab) and time scheduling problems. Nevertheless, I was able to finish at least a basic version of the robot, and it kind of did what it was supposed to do. (Though for a real competition, more work would be needed – the construction was very frail, and the bot could not properly unstick itself when it got misled, e.g. by colors seen outside of the playground.)

Source code: brmpuk.git

Blackline

The second task was a lot simpler – just follow a thick black line laid on the floor. With the communications, firmware and video processing infrastructure already debugged, it was just a matter of replacing the image recognition and I managed to get everything done and debugged in just a couple of hours.

To briefly describe the workings of the robot: the camera provides a picture of the robot’s neighborhood in the YUYV format. The software periodically grabs frames from the camera and looks at three rectangles covering the area slightly in front of the robot – straight ahead, to the right and to the left. (There is a slight gap to give the robot a chance to react with a sufficient head start, deal well with sharp turns and skip over gaps.)

The contents of the three rectangles are then analyzed; YUYV is a convenient pixel format for image recognition, since you already have brightness (luma) and hue (chrominance) separated. To detect a dark line on a light background, it is enough to look at the Y-values of the pixels. Second, we are detecting a high-contrast object that is always smaller than the rectangle we look at. So we do a trivial thing – take the luma difference between the darkest and the brightest pixel. We do not get dark pixels from dirt, since the camera image is sufficiently blurry, and the only large enough dark object on the white background is the line, so this works perfectly, can adjust to the overall brightness of the image (to a degree), and is really simple and foolproof.
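
In Perl, that check could look something like this – a sketch only, the real thing lives in brmpuk.git:

# Return the luma contrast (max - min) inside a rectangle of a raw
# YUYV frame; a value above some threshold means the line is present.
sub rect_contrast {
	my ($frame, $width, $x0, $y0, $w, $h) = @_;
	my ($min, $max) = (255, 0);
	for my $y ($y0 .. $y0 + $h - 1) {
		for my $x ($x0 .. $x0 + $w - 1) {
			# In YUYV, the luma of pixel (x,y) is the byte at
			# offset 2 * (y * width + x).
			my $luma = ord(substr($frame, 2 * ($y * $width + $x), 1));
			$min = $luma if $luma < $min;
			$max = $luma if $luma > $max;
		}
	}
	return $max - $min;
}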

The robot driving strategy is then trivial. After each frame, the control program gives an update on the new speed of both wheels. If it detects that the line is being followed, it goes straight ahead. Otherwise, if one of the side rectangles is active, it stops one motor and starts turning in that direction. If no rectangle is active, it may be that a sharp turn has been encountered: at one point, both the ahead rectangle and a side rectangle were active; the next moment, none was. Therefore, the robot looks at which of the side rectangles was active last and turns in that direction. Similar handling is used when both side rectangles are active (while turning to one side, the other rectangle may blink to activity, e.g. due to the table edge).
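
Sketched in Perl (again an illustration, not the actual brmpuk code), the per-frame decision amounts to:

# Inputs: the three rectangle detections plus which side rectangle was
# active most recently. Returns commands for (left wheel, right wheel).
sub drive {
	my ($left, $ahead, $right, $last_side) = @_;
	return ('go', 'go')   if $ahead && !$left && !$right;
	return ('stop', 'go') if $left && !$right;   # pivot left
	return ('go', 'stop') if $right && !$left;   # pivot right
	# Nothing seen (or both sides active): turn toward the side that
	# was active last.
	return $last_side eq 'left' ? ('stop', 'go') : ('go', 'stop');
}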

Source code: brmpuk.git blackline


P.S.: I just wasted three hours trying to work around bugs in WordPress, jQuery(?) and YouTube/Chromium so that I could publish this post. How do you people manage to live in this Web 2.0 world?!

Categories: hardware, software

Datasheet Translator

December 28th, 2010

As we (well, mostly people other than me) were dealing with a rather obscure micro-controller when hacking on our laser projector in brmlab, the only datasheet we found was in Chinese. This is quite often the case with obscure China-made parts (including even stuff like LEDs), and it’s annoying to deal with.

So I hacked together a simple datasheet translator – you feed it a PDF with your datasheet, specify the source language, let it munch away for a minute and then it spews out a link to the English translation!

datasheet-en.or.cz
Example: http://tinyurl.com/2bbq6o7

Its user interface is extremely rudimentary; if someone wants to add an AJAXy progressbar and what-not, just let me know. :-)

The “technology” is not much to mention either – thankfully, pdftohtml can do quite nifty stuff nowadays (it just needs a lot of beating to properly zoom the documents), and Google Translate can do an awesome job with technical documents.
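
In outline, the server side amounts to little more than this sketch – the pdftohtml flags and the Translate URL scheme are from memory, and the real site has more glue:

# Render the PDF to HTML (one page per file), then point Google
# Translate at the hosted result.
system('pdftohtml', '-c', '-zoom', '1.5', 'datasheet.pdf', 'out/page') == 0
	or die "pdftohtml failed";
my $url = 'http://translate.google.com/translate?sl=zh-CN&tl=en&u='
	. 'http://datasheet-en.or.cz/out/page.html';
print "$url\n";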

Categories: hardware, linux, software

quick update

December 28th, 2010

A couple of things to blog about have accumulated again, so I guess I will be posting a bunch of articles soon. After such a long gap, a general update is in order, I suppose.

Aside from my studies, continuous MCTS (Go) research, grinding away on random glibc bugs and such, I most notably got involved in brmlab – the Prague hackerspace – this summer. We have a pretty thriving community there now, with various cool events, a lot of great projects (I’m involved in quite a few), and so on!

Well, I wanted to mention more, but I can’t remember it right now. I suppose it will just come together. Oh, and seeing my last post – I have a new bike, this time with insurance. ;-)

Categories: life