Immediate, unfiltered thoughts on Go, after drinking buckets of Haskell.

Overall insight: Go enforces such strong patterns, some of which don’t make any sense, that it can widen the gap between what you are trying to do and the very specific way the language demands you express it.

Types but not really?

Unlike Haskell, Go’s source of truth is not its types. Types are looser in structure, but their forgiveness makes me feel as though I have to do more work. Types don’t compose (or maybe offer only first-order composition?), so mapping is just ad hoc polymorphism: every type provides its own map function.
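For example, before generics each slice type needed its own hand-rolled map; a minimal sketch (the helper names are mine):

```go
package main

import "fmt"

// Without a general Functor/fmap, each element type gets its own map.
func mapInts(xs []int, f func(int) int) []int {
	out := make([]int, len(xs))
	for i, x := range xs {
		out[i] = f(x)
	}
	return out
}

func mapStrings(xs []string, f func(string) string) []string {
	out := make([]string, len(xs))
	for i, x := range xs {
		out[i] = f(x)
	}
	return out
}

func main() {
	fmt.Println(mapInts([]int{1, 2, 3}, func(x int) int { return x * 2 }))          // [2 4 6]
	fmt.Println(mapStrings([]string{"a"}, func(s string) string { return s + "!" })) // [a!]
}
```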

Capitalization rules are frustrating

The only thing upper- or lower-case names communicate is exported/unexported visibility. This makes other, in my opinion higher-value, information ambiguous: an exported type and an exported function look the same, and an unexported function and a package look the same.

Error checking tedium is annoying

I’m unclear on which initializations and assignments require receiving an error (ex., res, err := vs. res :=) — does this just depend on the return types of the function being called? I feel like you could be more nuanced about which checks to enforce without shoving an error check everywhere. It adds code bloat, makes things less readable, and forces an error-centric view of the world onto your logic.

Uber imperative

Everything follows a similar pattern: loop over some input, perform some computation on each element, output the result… This makes code easy to read in some ways (you expect less variability, which reduces cognitive overhead); but by enforcing such strong patterns it blurs out other salient details. That, in turn, can increase cognitive load, because debugging is difficult when everything looks the same.
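The shape I mean, sketched as a sum of squares:

```go
package main

import "fmt"

func main() {
	input := []int{1, 2, 3, 4}

	// The ubiquitous shape: loop over input, compute per element, accumulate.
	total := 0
	for _, n := range input {
		total += n * n
	}
	fmt.Println(total) // 30
}
```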


Packages confuse me

Same-named functions defined by different packages are very common, though (Go has no overloading within a package), so qualifying calls as package.Function or package.function is the standard pattern. A package is actually scoped to a directory rather than a single file, and then there’s also the conventional “pkg” directory, which is a bit confusing?

I don’t like import syntax

They look like strings.

Zero values

I like that “zero values” inform errors, but it’s sometimes difficult to write abstractions that generalize error handling across a variety of types whose zero values differ (without lots of explicit casting).
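One generic way to get the zero value without casting (a sketch, using the generics I haven’t really explored yet):

```go
package main

import "fmt"

// zeroOf returns the zero value of any type T: var z T is zero-initialized.
func zeroOf[T any]() T {
	var z T
	return z
}

func main() {
	fmt.Println(zeroOf[int]())          // 0
	fmt.Println(zeroOf[string]() == "") // true
	fmt.Println(zeroOf[*int]() == nil)  // true
}
```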

Tooling is DOPE

That is all.

Tooling issue

Removing unused imports on save is cool, but sometimes when I’m starting off a file, it deletes the imports for code I haven’t written yet.


Pointers

I wonder if dependence on pointers increases when there’s no real parametric polymorphism (I haven’t played with their generics yet, though). Only because you have to be hyper-specific about the things you want to access, rather than describing what they are…

This is frustrating because arguments are passed by value, so mutation inside a function doesn’t stick; in Haskell I’d just produce new values:

func zero(x int) {
  x = 0 // assigns to the local copy only
}

func main() {
  x := 5
  zero(x)
  fmt.Println(x) // x is still 5
}
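To actually mutate the caller’s value you have to pass a pointer (a sketch):

```go
package main

import "fmt"

func zeroPtr(x *int) {
	*x = 0 // writes through the pointer, mutating the caller's variable
}

func main() {
	x := 5
	zeroPtr(&x)
	fmt.Println(x) // 0
}
```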


Methods are defined on (struct) types rather than on type classes as in Haskell. I’m still confused about whether the notions of types and terms are synonymous between Go and Haskell.
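A sketch of the difference: the method hangs off a concrete type, and an interface (the rough structural cousin of a type class) is satisfied implicitly:

```go
package main

import "fmt"

// Describable is the closest structural analogue to a one-method type class.
type Describable interface {
	Describe() string
}

type Point struct{ X, Y int }

// The method is attached to the concrete type; there is no instance declaration.
func (p Point) Describe() string {
	return fmt.Sprintf("Point(%d, %d)", p.X, p.Y)
}

func main() {
	var d Describable = Point{1, 2} // satisfied implicitly, never declared
	fmt.Println(d.Describe())       // Point(1, 2)
}
```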

I don’t see the point of named return values; I’d rather have types and type synonyms, I guess?
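For reference, what named results buy you is a bare return (a sketch):

```go
package main

import "fmt"

// divmod names its results, so the bare return sends back q and r.
func divmod(a, b int) (q, r int) {
	q = a / b
	r = a % b
	return
}

func main() {
	q, r := divmod(7, 2)
	fmt.Println(q, r) // 3 1
}
```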


Why is := not available outside of functions… seems like a random decision?
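At package scope only declarations (var, const, type, func) are allowed, which is presumably why; a sketch:

```go
package main

import "fmt"

// Package scope permits declarations only, so var is required here;
// x := 5 at this level would be a compile error.
var x = 5

func main() {
	y := 10 // := works inside function bodies
	fmt.Println(x + y) // 15
}
```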