All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog and this project adheres to Semantic Versioning.
This update contains breaking changes that remove the `raw` API with the hope of centralising on the `HashTable` API in the future. You can follow the discussion and progress in #545 to discuss features you think should be added to this API that were previously only possible on the `raw` API.
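As a rough illustration of the explicit-hashing style the `HashTable` API is built around (a minimal sketch; the `RandomState` hasher and the `(key, value)` layout are illustrative choices, not part of the release notes):

```rust
use std::collections::hash_map::RandomState;
use std::hash::BuildHasher;

use hashbrown::HashTable;

fn main() {
    // The caller supplies hashes explicitly; the table never hashes on its own.
    let hasher = RandomState::new();
    let mut table: HashTable<(u32, &str)> = HashTable::new();

    for pair in [(1u32, "one"), (2, "two")] {
        let hash = hasher.hash_one(pair.0);
        // The closure re-hashes existing entries if the table has to grow.
        table.insert_unique(hash, pair, |&(k, _)| hasher.hash_one(k));
    }

    // Lookups take a hash plus an equality predicate over the stored value.
    let found = table.find(hasher.hash_one(2u32), |&(k, _)| k == 2);
    assert_eq!(found, Some(&(2, "two")));
}
```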
- `borsh` feature with `BorshSerialize` and `BorshDeserialize` impls. (#525)
- `Assign` impls for `HashSet` operators. (#529)
- `Default` impls for iterator types. (#542)
- `HashTable::iter_hash{,_mut}` methods. (#549)
- `Hash{Table,Map,Set}::allocation_size` methods. (#553)
- `Debug` and `FusedIterator` for all `HashTable` iterators. (#561)
- `Iterator::fold` for all `HashTable` iterators. (#561)
- `hash_set::VacantEntry::insert` now returns `OccupiedEntry`. (#495)
- `hash_set::Difference::size_hint` lower-bound. (#530)
- `HashSet::is_disjoint` performance. (#531)
- The `equivalent` feature is now enabled by default. (#532)
- `HashSet` operators now return a set with the same allocator. (#529)
- The `ahash` feature has been renamed to `default-hasher`. (#533)
- `Hash{Map,Set}::insert_unique_unchecked` is now unsafe. (#556)
- The signature of `get_many_mut` and related methods was changed (see the sketch after these notes). (#562)
- The `raw-entry` feature is deprecated, to be eventually removed. (#534, #555)
- The `raw` feature is removed; in the future, all code should be using the `HashTable` API instead. (#531, #546)
- The `rkyv` feature was removed; this is now provided by the `rkyv` crate instead. (#554)
- `HashSet::get_or_insert_owned` was removed in favor of `get_or_insert_with`. (#555)
- Fix in `clone_from_impl`. (#511)

This release was yanked due to a breaking change.
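For the `get_many_mut` change above (#562), a minimal sketch assuming the post-change return shape of one `Option` per requested key (the map contents here are made up):

```rust
use hashbrown::HashMap;

fn main() {
    let mut scores: HashMap<&str, i32> = HashMap::new();
    scores.insert("a", 1);
    scores.insert("b", 2);

    // Assumed post-#562 shape: an array of Options, one per requested key,
    // instead of a single Option wrapping the whole array.
    let [a, missing] = scores.get_many_mut(["a", "z"]);
    if let Some(a) = a {
        *a += 10;
    }
    assert!(missing.is_none());
    assert_eq!(scores["a"], 11);
}
```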
- `fold` implementation of iterators. (#480)
- Stopped using `ptr::invalid_mut` on nightly. (#481)
- `HashTable` type which provides a low-level but safe API with explicit hashing. (#466)
- Allow serializing `HashMap`s that use a custom allocator. (#449)
- `Equivalent` trait from the `equivalent` crate. (#442)
- Removed the `Clone` requirement from custom allocators. (#468)
- Support for the `allocator-api2` crate for interfacing with custom allocators on stable. (#417)
- `Equivalent` trait to look up values without `Borrow`. (#345)
- `Hash{Map,Set}::raw_table_mut` is added, which returns a mutable reference. (#404)
- `clear` on empty tables. (#428)
- `DrainFilter` (`drain_filter`) has been renamed to `ExtractIf` and no longer drops remaining elements when the iterator is dropped (see the example after these notes). (#374)
- `Hash{Map,Set}::raw_table` now returns an immutable reference. (#404)
- `VacantEntry` and `OccupiedEntry` now use the default hasher if none is specified in generics. (#389)
- `RawTable::data_start` now returns a `NonNull` to match `RawTable::data_end`. (#387)
- `RawIter::{reflect_insert, reflect_remove}` are now unsafe. (#429)
- `RawTable::find_potential` is renamed to `find_or_find_insert_slot` and returns an `InsertSlot`. (#429)
- `RawTable::remove` now also returns an `InsertSlot`. (#429)
- `InsertSlot` can be used to insert an element with `RawTable::insert_in_slot`. (#429)
- `RawIterHash` no longer has a lifetime tied to that of the `RawTable`. (#427)
- The bounds of `HashSet::raw_table` have been relaxed to not require `Eq + Hash`. (#423)
- `EntryRef::and_replace_entry_with` and `OccupiedEntryRef::replace_entry_with` were changed to give a `&K` instead of a `&Q` to the closure.
- Support for `bumpalo` as an allocator with custom wrapper was removed. Use the `allocator-api2` feature in `bumpalo` to use it as an allocator for `hashbrown` collections. (#417)
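For the `DrainFilter` to `ExtractIf` rename above (#374), a small sketch of the renamed method (the contents and predicate are illustrative):

```rust
use hashbrown::HashMap;

fn main() {
    let mut map: HashMap<i32, i32> = (0..8).map(|x| (x, x)).collect();

    // extract_if removes and yields the entries that match the predicate.
    let evens: Vec<(i32, i32)> = map.extract_if(|k, _v| k % 2 == 0).collect();

    assert_eq!(evens.len(), 4);
    // Non-matching entries stay in the map. Had the iterator been dropped
    // before exhaustion, the unvisited entries would also have remained.
    assert_eq!(map.len(), 4);
}
```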
- `#[inline(always)]` added to `find_inner`. (#375)
- Fixed `RawTable::allocation_info` for empty tables. (#376)
- `Equivalent` trait to customize key lookups (see the sketch after these notes). (#350)
- `RawTable::allocation_info`, which provides information about the memory usage of a table. (#371)
- Updated to `ahash` 0.8. (#357)
- `with_hasher_in` is now const. (#355)
- The following methods were removed from the `RawTable` API in favor of safer alternatives:
  - `RawTable::erase_no_drop` => Use `RawTable::erase` or `RawTable::remove` instead.
  - `Bucket::read` => Use `RawTable::remove` instead.
  - `Bucket::drop` => Use `RawTable::erase` instead.
  - `Bucket::write` => Use `Bucket::as_mut` instead.
- Ensure that `HashMap` allocations don't exceed `isize::MAX`. (#362)
- Changes to `RawTable::clone_from`. (#348)
- `Entry` API for `HashSet`. (#342)
- `Extend<&'a (K, V)> for HashMap<K, V, S, A>`. (#340)
- Access to the `RawTable` of a `HashMap`. (#335)
- `do_alloc` reworked to reduce the LLVM IR generated. (#341)
- Fixed `RawIterRange::size_hint`. (#325)
- Fixed `Debug` for `ValuesMut` and `IntoValues`. (#325)
- `From<[T; N]>` and `From<[(K, V); N]>` for `HashSet` and `HashMap` respectively. (#297)
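For the `Equivalent` items above (#345, #350), a sketch of a lookup key that needs no `Borrow` impl; the `UserKey`/`UserRef` types are invented for illustration, and the two types must hash identically for lookups to match:

```rust
use hashbrown::{Equivalent, HashMap};

// Owned composite key stored in the map.
#[derive(Hash, PartialEq, Eq)]
struct UserKey {
    tenant: String,
    name: String,
}

// Borrowed view used only for lookups; no Borrow impl is required.
#[derive(Hash)]
struct UserRef<'a> {
    tenant: &'a str,
    name: &'a str,
}

impl Equivalent<UserKey> for UserRef<'_> {
    fn equivalent(&self, key: &UserKey) -> bool {
        self.tenant == key.tenant && self.name == key.name
    }
}

fn main() {
    let mut map = HashMap::new();
    map.insert(
        UserKey { tenant: "acme".into(), name: "alice".into() },
        42,
    );

    // Both derived Hash impls hash the same field data in the same order,
    // so the borrowed view finds the owned key.
    let probe = UserRef { tenant: "acme", name: "alice" };
    assert_eq!(map.get(&probe), Some(&42));
}
```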
- An `allocator()` getter to HashMap and HashSet. (#257)
- `insert_unique_unchecked` to `HashMap` and `HashSet`. (#293)
- `into_keys` and `into_values` to HashMap. (#295)
- `From<array>` on `HashSet` and `HashMap`. (#298)
- `entry_ref` API to `HashMap` (see the sketch after these notes). (#201)
- Changes to `find`. (#279)
- `BuildHasher::hash_one` is used when `feature = "nightly"` is enabled. (#292)
- `Debug` for `HashSet`. (#296)
- `get_each_mut` renamed to `get_many_mut` to align the API with the stdlib. (#291)
- Made `RawTable::insert_no_grow` unsafe. (#254)
- Changes to `static_empty`. (#280)
- `HashMap`'s and `HashSet`'s `Clone` impls. (#252)
- `pub` modifier added to `BumpWrapper`. (#251)
- A `try_insert_no_grow` method to `RawTable`. (#229)
- Support for `bumpalo` as an allocator without the `nightly` feature. (#231)
- `Default` for `RawTable`. (#237)
- `RawTable::get_each_mut`, `HashMap::get_each_mut`, and `HashMap::get_each_key_value_mut`. (#239)
- `From<HashMap<T, ()>>` for `HashSet<T>`. (#235)
- A `try_insert` method to `HashMap`. (#247)
- Fixed an issue with `aHash`, which was resulting in inconsistent hashes being generated for a key. (#248)

This release was yanked due to inconsistent hashes being generated with the `nightly` feature. (#248)
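For the `entry_ref` API above (#201), a minimal sketch of counting with borrowed keys (the word list is illustrative):

```rust
use hashbrown::HashMap;

fn main() {
    let mut counts: HashMap<String, u32> = HashMap::new();

    for word in ["apple", "banana", "apple"] {
        // entry_ref takes a borrowed key; an owned String is only created
        // the first time the key is actually inserted.
        *counts.entry_ref(word).or_insert(0) += 1;
    }

    assert_eq!(counts["apple"], 2);
    assert_eq!(counts.len(), 2);
}
```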
- Parametrized `RawTable`, `HashSet` and `HashMap` over an allocator. (#133)
- `RawTable`'s reserve functions are called once per key-value. (#204)
- Methods added to `RawTable` (#202):
  - `get`: `find` and `as_ref`
  - `get_mut`: `find` and `as_mut`
  - `insert_entry`: `insert` and `as_mut`
  - `remove_entry`: `find` and `remove`
  - `erase_entry`: `find` and `erase`
- `from_key_hashed_nocheck`'s `Q: Hash` bound was removed. (#200)
- Made `RawTable::drain` safe. (#201)
- `drain_filter` now removes and yields items that do match the predicate, rather than items that don't. This is a breaking change to match the behavior of the `drain_filter` methods in `std`. (#187)
- `replace_entry_with` added to `OccupiedEntry`, and `and_replace_entry_with` to `Entry` (see the sketch after these notes). (#190)
- `FusedIterator` and `size_hint` for `DrainFilter`. (#188)
- The minimum Rust version has been bumped (due to `crossbeam` dependency). (#193)
- Updated the `ahash` dependency to 0.4. (#198)
- `HashMap::with_hasher` and `HashSet::with_hasher` are now `const fn`. (#195)
- Removed `T: Hash + Eq` and `S: BuildHasher` bounds on `HashSet::new`, `with_capacity`, `with_hasher`, and `with_capacity_and_hasher`. (#185)
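For `replace_entry_with`/`and_replace_entry_with` above (#190), a sketch assuming the closure shape `FnOnce(&K, V) -> Option<V>`, where returning `None` removes the entry (the threshold logic is illustrative):

```rust
use hashbrown::hash_map::Entry;
use hashbrown::HashMap;

fn main() {
    let mut map: HashMap<&str, u32> = HashMap::new();
    map.insert("hits", 3);

    // For an occupied entry, the closure either keeps a new value (Some)
    // or removes the entry (None); a vacant entry passes through untouched.
    let entry = map.entry("hits").and_replace_entry_with(|_key, value| {
        if value >= 3 {
            None // evict once the counter reaches the threshold
        } else {
            Some(value + 1)
        }
    });

    assert!(matches!(entry, Entry::Vacant(_)));
    assert!(!map.contains_key("hits"));
}
```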
- `erase` and `remove` added to `RawTable`. (#171)
- `try_with_capacity` added to `RawTable`. (#174)
- Re-use of a `RawIter` for `RawDrain`, `RawIntoIter`, and `RawParIter`. (#175)
- `reflect_remove` and `reflect_insert` added to `RawIter`. (#175)
- A `drain_filter` function for `HashSet`. (#179)
- Deprecated `RawTable::erase_no_drop` in favor of `erase` and `remove`. (#176)
- `insert_no_grow` is now exposed under the `"raw"` feature. (#180)
- Marked `RawTable::par_iter` as `unsafe`. (#157)
- Changes to `HashMap`. (#159)
- Removed `K: Eq + Hash` bounds on `retain`. (#163)
- `HashMap` changes from rust-lang/rust (#164):
  - `extend_one` support on nightly.
  - `CollectionAllocErr` renamed to `TryReserveError`.
  - `HashSet::get_or_insert_owned`.
  - `Default` for `HashSet` no longer requires `T: Eq + Hash` and `S: BuildHasher`.
- `or_insert_with_key` added to `Entry` (see the sketch after these notes). (#152)
- Removed a `Clone` optimization which was unsound. (#154)
- Disabled use of `const-random` by default, which prevented reproducible builds. (#155)
- Changes to the `repeat` function. (#150)
- Use of `NonNull` for buckets, which improves codegen for iterators. (#148)
- `HashMap::get_key_value_mut`. (#145)
- Changes to the `Clone` implementation. (#146)
- A `drain_filter` function for `HashMap`. (#135)
- Updated the `ahash` dependency to 0.3. (#141)
- `raw_entry` can now be used without requiring `S: BuildHasher`. (#123)
- `RawTable::bucket_index` can now be used under the `raw` feature. (#128)
- An `ahash-compile-time-rng` feature (enabled by default) which allows disabling the `compile-time-rng` feature in `ahash` to work around a Cargo bug. (#125)
- An `inline-more` feature (enabled by default) which allows choosing a tradeoff between runtime performance and compilation time. (#119)
- `Entry::insert` and `RawEntryMut::insert`. (#118)
- `Group::static_empty` was changed from a `const` to a `static`. (#116)
- Fixed an accidental dependency on `std`. (#110)
- The minimum Rust version has been bumped (due to `rand` dependency).

This release was yanked due to a breaking change for users of `no-default-features`.
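For `or_insert_with_key` above (#152), a minimal sketch (the key and the derived default are illustrative):

```rust
use hashbrown::HashMap;

fn main() {
    let mut lengths: HashMap<String, usize> = HashMap::new();

    // The closure receives a reference to the inserted key, so the default
    // value can be computed from the key itself.
    let len = lengths
        .entry("changelog".to_string())
        .or_insert_with_key(|key| key.len());

    assert_eq!(*len, 9);
}
```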
- The `RawTable` API is available under the “raw” feature. (#108)
- Changes to `HashSet`. (#98)
- `hashbrown` is now fully `no_std` on recent Rust versions (1.36+). (#96)
- `RawOccupiedEntryMut` now properly implements `Send` and `Sync`. (#100)
- Bumped the minimum `lazy_static` version. (#92)
- Fixed `Send` trait bounds on `IterMut` not matching the libstd one. (#82)
- `insert_with_hasher` added to the raw_entry API to allow `K: !(Hash + Eq)`. (#54)
- `#[may_dangle]` attributes to match the libstd `HashMap`. (#46)
- `raw_entry` support (#31)
- `#[may_dangle]` on nightly (#31)
- `try_reserve` support (see the sketch after these notes) (#31)
- Changes to `IterMut`. (#31)
- Changes to the `erase_no_drop` implementation. (#26)
- Fixed `clear` segfaults when called on an empty table. (#13)
- Fixed `erase_no_drop` optimization not triggering in the SSE2 implementation. (#3)
- `Send` and `Sync` for hash map and iterator types. (#7)
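For the `try_reserve` support above (#31), a minimal sketch of fallible reservation (the capacity value is illustrative):

```rust
use hashbrown::HashMap;

fn main() {
    let mut map: HashMap<u64, u64> = HashMap::new();

    // try_reserve reports allocation failure instead of aborting the process.
    match map.try_reserve(1024) {
        Ok(()) => {
            map.insert(1, 1);
        }
        Err(err) => eprintln!("failed to reserve capacity: {err:?}"),
    }
}
```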