Sunday, November 28, 2010

Goroutines vs function literals (closures)

Goroutines can be a heavyweight way to deal with a situation where you really just want some kind of lazy evaluation. Say I would like to process a file line by line; the basic guts of it look like this with a goroutine:

func lineStreamer(out chan<- string) {
	file, err := os.Open("/usr/share/dict/words", os.O_RDONLY, 0666)
	if err != nil {
		panic("Failed to open file for reading")
	}
	defer file.Close()

	reader := bufio.NewReader(file)
	for {
		line, err := reader.ReadString('\n')
		if err != nil {
			close(out) // signal the consumer that we're done
			return
		}
		// Do something interesting here perhaps, other than returning a raw line
		out <- line
	}
}

This greatly simplifies the act of opening the file and dealing with bufio, and gives me an interface I can just read lines (or processed lines) from on a channel. But it seems kind of slow, running at about 2.04 to 2.07 seconds on my MacBook Pro with no runtime tuning. If I raise GOMAXPROCS to 2 I get between 1.836 and 1.929 seconds. GOMAXPROCS at 3 gets me a fairly regular 1.83 seconds.

This got me thinking about how I'd do something like this in other languages. I don't think I'd need coroutines to do it in Scheme, for example, as I could do some delay/force thing to get stuff evaluated in chunks.

This led me to the following, possibly non-idiomatic version of a Go program using function literals.

type Cont func() (string, os.Error, Cont) // a line, an error, and a function returning the next

func lineStreamer(file *os.File, reader *bufio.Reader) (string, os.Error, Cont) {
	line, err := reader.ReadString('\n')
	return line, err, func() (string, os.Error, Cont) {
		return lineStreamer(file, reader)
	}
}

To evaluate all the lines I can do something like the following:

s, err, next := lineStreamer(file, reader)

for err == nil {
	fmt.Printf("%s", s)
	s, err, next = next()
}

And my run times are down to about 1.2 seconds.

I guess my question is: is this idiomatic or not?