We had to solve the Studious Student exercise.
Leo wanted us to experiment with how useful testing is in an exercise like this one, so we divided the group into pairs: some used TDD, some wrote tests after writing the code, and some didn't test at all.
The exercise is a bit tricky, something the pairs that weren't testing only discovered at the end, whereas the ones doing TDD or writing tests after coding discovered it early in the process.
After a one-hour iteration, we had a very interesting debate about automated testing.
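The tricky part is that sorting the words alphabetically and then concatenating them doesn't always produce the lexicographically lowest string; you need to sort by comparing the two possible concatenations of each pair of words. A quick REPL sketch, using one of the word lists from my tests below, shows the difference:

;; Naive alphabetical sort: "ji" sorts before "jibw",
;; but the resulting concatenation is not the lowest possible string.
(apply str (sort ["jibw" "ji" "jp" "bw" "jibw"]))
;;=> "bwjijibwjibwjp"

;; Sorting by comparing the two possible concatenations of each pair
;; of words gives the lexicographically lowest string.
(apply str (sort #(compare (str %1 %2) (str %2 %1))
                 ["jibw" "ji" "jp" "bw" "jibw"]))
;;=> "bwjibwjibwjijp"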
This is my solution to the exercise in Clojure:
(ns studious-student.core)

(def ^:private join (partial apply str))

(defn lexic-shortest-concat [words-list]
  ;; Sorting by comparing the two possible concatenations of each pair of
  ;; words, and then joining, yields the lexicographically lowest string.
  (join (sort #(compare (str %1 %2)
                        (str %2 %1))
              words-list)))

(defn- file-lines [file]
  ;; Skips the first line of the input file (the header with the number of cases).
  (rest (clojure.string/split-lines (slurp file))))

(defn- line-words [line]
  ;; Skips the first token of each line (the number of words in it).
  (rest (clojure.string/split line #" ")))

(defn- extract-words-lists [file]
  (map line-words (file-lines file)))

(defn lexic-shortest-concat-lines [file]
  (->> file
       extract-words-lists
       (map lexic-shortest-concat)))

(defn studious-student [file-in file-out]
  (spit file-out
        (clojure.string/join
          "\n"
          (lexic-shortest-concat-lines file-in))))
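From the REPL, the whole exercise can be run on the sample input file used by the tests; here is a rough sketch (the output path is just an example I made up, not a file from the repository):

;; Illustrative REPL usage; the output path is only an example.
(require '[studious-student.core :refer [studious-student]])

(studious-student "./test/studious_student/studious_student.in"
                  "./studious_student.out")
;; The output file ends up with one line per input case, each one being
;; the lexicographically lowest concatenation of that case's words.

And these are the Midje tests: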
(ns studious-student.core-test
  (:use midje.sweet)
  (:use [studious-student.core]))

(facts
  "about Studious Student exercise"

  (facts
    "about concatenating the words in a list of words
     to generate the lexicographically lowest possible string"

    (fact
      "it works for an empty words list"
      (lexic-shortest-concat []) => "")

    (fact
      "it works for trivial non-empty words lists"
      (lexic-shortest-concat
        ["facebook" "hacker" "cup" "for" "studious" "students"])
      => "cupfacebookforhackerstudentsstudious"

      (lexic-shortest-concat
        ["k" "duz" "q" "rc" "lvraw"]) => "duzklvrawqrc"

      (lexic-shortest-concat
        ["mybea" "zdr" "yubx" "xe" "dyroiy"]) => "dyroiymybeaxeyubxzdr"

      (lexic-shortest-concat
        ["uiuy" "hopji" "li" "j" "dcyi"]) => "dcyihopjijliuiuy")

    (fact
      "it also works for non-trivial word lists"
      (lexic-shortest-concat
        ["jibw" "ji" "jp" "bw" "jibw"]) => "bwjibwjibwjijp"))

  (facts
    "about concatenating the words in each line of a file
     to generate the lexicographically lowest possible strings"

    (fact
      "it reads a file and concatenates the words in each line
       to generate the lexicographically lowest possible strings"
      (lexic-shortest-concat-lines
        "./test/studious_student/studious_student.in")
      => '("cupfacebookforhackerstudentsstudious"
           "duzklvrawqrc"
           "dyroiymybeaxeyubxzdr"
           "bwjibwjibwjijp"
           "dcyihopjijliuiuy"))

    (let [out "./test/studious_student/s.out"]
      (fact
        "it writes an output file with the lexicographically lowest possible strings
         of the words in each line of a given file"
        (do (studious-student "./test/studious_student/studious_student.in" out)
            (clojure.string/split-lines (slurp out)))
        => '("cupfacebookforhackerstudentsstudious"
             "duzklvrawqrc"
             "dyroiymybeaxeyubxzdr"
             "bwjibwjibwjijp"
             "dcyihopjijliuiuy")
        ;; Clean up the generated output file after the fact has run.
        (against-background (after :facts (clojure.java.io/delete-file out))))

      (fact
        "it also works for the long given input file"
        (do (studious-student "./test/studious_student/studious_student_long.in" out)
            (slurp out))
        => (slurp "./test/studious_student/studious_student_long.out")
        (against-background (after :facts (clojure.java.io/delete-file out)))))))
You can see all the code in this GitHub repository.
I'd like to thank Leo for being so kind and sharing his knowledge with us.