Chapter mini: word counter

Time to pull the chapter together into one small program. You will write a word counter that takes a paragraph of text, tokenises it, counts how many times each word appears, and prints the results sorted alphabetically by word.

This exercise is a straight-line use of slices, maps, ranging, sorting, and strings, and it has the shape of dozens of small Go utilities in the wild.

The specification

Given this paragraph as a string literal in main:

text := "the quick brown fox jumps over the lazy dog the quick fox"

Produce this output, one word per line, alphabetical by word:

brown: 1
dog: 1
fox: 2
jumps: 1
lazy: 1
over: 1
quick: 2
the: 3

What to use

Every ingredient has already been introduced in this chapter or in the ones before it. A clean structure is:

  • strings.Fields(text) splits the text on whitespace and returns a []string of tokens.
  • A map[string]int holds the counts. counts[word]++ works even the first time you see a word, because a missing-key read returns 0 and ++ bumps it to 1.
  • A first loop walks the tokens and updates the map.
  • A second loop collects the map's keys into a []string. Creating it with make([]string, 0, len(counts)) gives it zero length but enough capacity up front, so appending never reallocates.
  • sort.Strings(keys) sorts the slice in place.
  • A third loop prints each key with its count using fmt.Printf("%s: %d\n", k, counts[k]).

If you get stuck, the rhythm is: tokenise first, count in a single pass, collect and sort keys, print sorted.
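
If you want to compare your attempt against a reference, here is one way the pieces can fit together. Treat it as a sketch rather than the only valid answer; names such as counts and keys are just one choice.

package main

import (
    "fmt"
    "sort"
    "strings"
)

func main() {
    text := "the quick brown fox jumps over the lazy dog the quick fox"

    // Tokenise: strings.Fields splits on any run of whitespace.
    words := strings.Fields(text)

    // Count in a single pass; a missing key reads as 0, so ++ works the first time too.
    counts := make(map[string]int)
    for _, w := range words {
        counts[w]++
    }

    // Collect the keys and sort them so the output order is deterministic.
    keys := make([]string, 0, len(counts))
    for k := range counts {
        keys = append(keys, k)
    }
    sort.Strings(keys)

    // Print each word with its count, one per line.
    for _, k := range keys {
        fmt.Printf("%s: %d\n", k, counts[k])
    }
}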

Task

Produce the expected output above. The word order must be alphabetical and must not depend on the input order. Do not hard-code the counts in a map literal; they must come from counting the tokens in text at runtime.

Expected output

brown: 1
dog: 1
fox: 2
jumps: 1
lazy: 1
over: 1
quick: 2
the: 3