██╗░░██╗░█████╗░░██████╗██╗░░██╗
██║░░██║██╔══██╗██╔════╝██║░░██║
███████║███████║╚█████╗░███████║
██╔══██║██╔══██║░╚═══██╗██╔══██║
██║░░██║██║░░██║██████╔╝██║░░██║
╚═╝░░╚═╝╚═╝░░╚═╝╚═════╝░╚═╝░░╚═╝
hash is a SHA-256 library built around a Hash type that handles string formatting, JSON serialization, and streaming — so you don't have to juggle raw byte slices everywhere.
## Installation

```shell
go get ella.to/hash@v0.0.2
```

## Quick Start

```go
h := hash.FromBytes([]byte("hello world"))

fmt.Println(h.String()) // sha256-b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9
fmt.Println(h.Short())  // fcde9
```

## Creating Hashes

```go
// From a byte slice
h := hash.FromBytes(data)

// From a byte slice, reusing an existing buffer
buf := make([]byte, hash.ByteSize)
h := hash.FromBytesReuse(data, buf)

// From any io.Reader
h, err := hash.FromReader(file)
```

## Hashing While Reading

When you need to process data and compute its hash at the same time, use TeeReader. The hash is calculated as data flows through the reader; nothing is buffered in memory.
```go
file, _ := os.Open("large-file.dat")
defer file.Close()

reader, getHash := hash.FromTeeReader(file)

// Process the data (e.g. upload, copy, etc.)
io.Copy(destination, reader)

// Get the hash after all data has been read
h := getHash()
fmt.Println(h.String())
```

For more control, use the struct-based TeeReader:
```go
tr := hash.NewTeeReader(file)

buf := make([]byte, 4096)
for {
	n, err := tr.Read(buf)
	if n > 0 {
		// process buf[:n]
	}
	if err == io.EOF {
		break
	}
	if err != nil {
		// handle the read error, then stop
		break
	}
}

h := tr.Hash()
```

## Chunked Hashing with StreamChunk

StreamChunk computes SHA-256 hashes for fixed-size segments of a stream. It is useful when you need per-chunk integrity hashes for a larger file.
```go
// 1 MB chunks, up to 4 chunk hashes, from a 10 MB file
sc := hash.NewStreamChunk(1024*1024, 4, 10*1024*1024)
io.Copy(sc, file)
hashes := sc.Hashes() // returns a []Hash, one per chunk
```

When totalSize is provided, chunk selection is randomized across the entire file for better coverage. When it is 0, the first N chunks are used.
## Parsing

```go
// From the standard string format ("sha256-hex...")
h, err := hash.ParseFromString("sha256-b94d27b9...")

// From raw 32 bytes
h, err := hash.ParseFromBytes(rawBytes)
```

Parsing validates the input: wrong lengths or malformed hex strings return clear errors.
## JSON Support

Hash implements encoding.TextMarshaler and encoding.TextUnmarshaler, so it works with JSON out of the box:

```go
type Document struct {
	Name string    `json:"name"`
	Hash hash.Hash `json:"hash"`
}

doc := Document{
	Name: "readme.txt",
	Hash: hash.FromBytes(content),
}

data, _ := json.Marshal(doc)
// {"name":"readme.txt","hash":"sha256-b94d27b9..."}
```

## String Format

All hashes use the format sha256-<64 hex characters>, which is 71 characters total. The Short() method returns the last 5 hex characters, handy for logs and debug output.
```go
const StringSize = 71 // full string length
const ByteSize = 32   // raw byte length
```

```go
// Format a hash for display (returns "nil" for nil input)
hash.Format(someHash) // "sha256-abc..."

// Print a short hash with extra context to a writer
hash.Print(os.Stdout, someHash, "uploaded successfully")
// Output: a27ae uploaded successfully
```

## Concurrency

Hash generation functions (FromBytes, FromReader, etc.) are safe to call from multiple goroutines; each call operates on independent state. Individual TeeReader instances should be used from a single goroutine, as with any io.Reader.
## License

MIT. See LICENSE for details.