diff --git a/CHANGELOG.md b/CHANGELOG.md
index 1452801..76b9e7c 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,52 +1,58 @@
# `dumpster` Changelog
+## 1.0.0
+
+### Breaking changes
+
+- Renamed `Collectable` to `Trace`, including the derive macro: `#[derive(Collectable)]` becomes `#[derive(Trace)]`.
+
## 0.2.1
### New features
-- Implement `Collectable` for `std::any::TypeId`.
+- Implement `Collectable` for `std::any::TypeId`.
## 0.2.0
### New features
-- Added `Gc::as_ptr`.
-- Added `Gc::ptr_eq`.
-- Implemented `PartialEq` and `Eq` for garbage collected pointers.
+- Added `Gc::as_ptr`.
+- Added `Gc::ptr_eq`.
+- Implemented `PartialEq` and `Eq` for garbage collected pointers.
### Other
-- Changed license from GNU GPLv3 or later to MPL 2.0.
-- Allocations which do not contain `Gc`s will simply be reference counted.
+- Changed license from GNU GPLv3 or later to MPL 2.0.
+- Allocations which do not contain `Gc`s will simply be reference counted.
## 0.1.2
### New features
-- Implement `Collectable` for `OnceCell`, `HashMap`, and `BTreeMap`.
-- Add `try_clone` and `try_deref` to `unsync::Gc` and `sync::Gc`.
-- Make dereferencing `Gc` only panic on truly-dead `Gc`s.
+- Implement `Collectable` for `OnceCell`, `HashMap`, and `BTreeMap`.
+- Add `try_clone` and `try_deref` to `unsync::Gc` and `sync::Gc`.
+- Make dereferencing `Gc` only panic on truly-dead `Gc`s.
### Bugfixes
-- Prevent dead `Gc`s from escaping their `Drop` implementation, potentially causing UAFs.
-- Use fully-qualified name for `Result` in derive macro, preventing some bugs.
+- Prevent dead `Gc`s from escaping their `Drop` implementation, potentially causing UAFs.
+- Use fully-qualified name for `Result` in derive macro, preventing some bugs.
### Other
-- Improve performance in `unsync` by using `parking_lot` for concurrency primitives.
-- Improve documentation of panicking behavior in `Gc`.
-- Fix spelling mistakes in documentation.
+- Improve performance in `unsync` by using `parking_lot` for concurrency primitives.
+- Improve documentation of panicking behavior in `Gc`.
+- Fix spelling mistakes in documentation.
## 0.1.1
### Bugfixes
-- Prevent possible UAFs caused by accessing `Gc`s during `Drop` impls by panicking.
+- Prevent possible UAFs caused by accessing `Gc`s during `Drop` impls by panicking.
### Other
-- Fix spelling mistakes in documentation.
+- Fix spelling mistakes in documentation.
## 0.1.0
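Since the 1.0.0 breaking change is a pure identifier rename, downstream migration is mechanical. A minimal sketch of the idea (illustrative only; a real migration should review each match, since `Collectable` may also appear in prose or doc comments):

```rust
/// Rewrite downstream source text for the 1.0.0 rename.
/// Token-blind string replacement is enough here because the old and new
/// names are used identically in trait impls and derive attributes.
fn migrate(src: &str) -> String {
    src.replace("Collectable", "Trace")
}
```

For example, `migrate("#[derive(Collectable)]")` yields `#[derive(Trace)]`.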
diff --git a/README.md b/README.md
index 2d55c56..f03fd73 100644
--- a/README.md
+++ b/README.md
@@ -7,14 +7,14 @@ It detects unreachable allocations and automatically frees them.
In short, `dumpster` offers a great mix of usability, performance, and flexibility.
-- `dumpster`'s API is a drop-in replacement for `std`'s reference-counted shared allocations
- (`Rc` and `Arc`).
-- It's very performant and has builtin implementations of both thread-local and concurrent
- garbage collection.
-- There are no restrictions on the reference structure within a garbage-collected allocation
- (references may point in any way you like).
-- It's trivial to make a custom type collectable using the provided derive macros.
-- You can even store `?Sized` data in a garbage-collected pointer!
+- `dumpster`'s API is a drop-in replacement for `std`'s reference-counted shared allocations
+ (`Rc` and `Arc`).
+- It's very performant and has builtin implementations of both thread-local and concurrent
+ garbage collection.
+- There are no restrictions on the reference structure within a garbage-collected allocation
+ (references may point in any way you like).
+- It's trivial to implement `Trace` for a custom type using the provided derive macros.
+- You can even store `?Sized` data in a garbage-collected pointer!
## How it works
@@ -34,14 +34,14 @@ garbage collector in the module `unsync`, and one thread-safe garbage collector
`sync`.
These garbage collectors can be safely mixed and matched.
-This library also comes with a derive macro for creating custom collectable types.
+This library also comes with a derive macro for implementing `Trace` on custom types.
## Examples
```rust
-use dumpster::{Collectable, unsync::Gc};
+use dumpster::{Trace, unsync::Gc};
-#[derive(Collectable)]
+#[derive(Trace)]
struct Foo {
ptr: RefCell<Option<Gc<Foo>>>,
}
@@ -59,8 +59,8 @@ let foo = Gc::new(Foo {
// If we had used `Rc` instead of `Gc`, this would have caused a memory leak.
drop(foo);
-// Trigger a collection.
-// This isn't necessary, but it guarantees that `foo` will be collected immediately (instead of
+// Trigger a collection.
+// This isn't necessary, but it guarantees that `foo` will be collected immediately (instead of
// later).
dumpster::unsync::collect();
```
@@ -71,7 +71,7 @@ To install, simply add `dumpster` as a dependency to your project.
```toml
[dependencies]
-dumpster = "0.2.1"
+dumpster = "1.0.0"
```
## Optional features
@@ -79,14 +79,14 @@ dumpster = "0.2.1"
`dumpster` has two optional features: `derive` and `coerce-unsized`.
`derive` is enabled by default.
-It enables the derive macro for `Collectable`, which makes it easy for users to implement their
-own collectable types.
+It enables the derive macro for `Trace`, which makes it easy for users to implement
+`Trace` for their own types.
```rust
-use dumpster::{unsync::Gc, Collectable};
+use dumpster::{unsync::Gc, Trace};
use std::cell::RefCell;
-#[derive(Collectable)] // no manual implementation required
+#[derive(Trace)] // no manual implementation required
struct Foo(RefCell<Option<Gc<Foo>>>);
let my_foo = Gc::new(Foo(RefCell::new(None)));
@@ -110,7 +110,7 @@ To use `coerce-unsized`, edit your installation to `Cargo.toml` to include the f
```toml
[dependencies]
-dumpster = { version = "0.2.1", features = ["coerce-unsized"]}
+dumpster = { version = "1.0.0", features = ["coerce-unsized"]}
```
## License
diff --git a/dumpster/Cargo.toml b/dumpster/Cargo.toml
index c70860c..65cbd23 100644
--- a/dumpster/Cargo.toml
+++ b/dumpster/Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "dumpster"
-version = "0.2.1"
+version = "1.0.0"
edition = "2021"
license = "MPL-2.0"
authors = ["Clayton Ramsey"]
@@ -17,7 +17,7 @@ derive = ["dep:dumpster_derive"]
[dependencies]
parking_lot = "0.12"
-dumpster_derive = { version = "0.2.0", path = "../dumpster_derive", optional = true }
+dumpster_derive = { version = "1.0.0", path = "../dumpster_derive", optional = true }
[dev-dependencies]
fastrand = "2.0.0"
diff --git a/dumpster/src/impls.rs b/dumpster/src/impls.rs
index 080c339..f813134 100644
--- a/dumpster/src/impls.rs
+++ b/dumpster/src/impls.rs
@@ -6,7 +6,7 @@
file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
-//! Implementations of [`Collectable`] for common data types.
+//! Implementations of [`Trace`] for common data types.
#![allow(deprecated)]
@@ -37,12 +37,12 @@ use std::{
},
};
-use crate::{Collectable, Visitor};
+use crate::{Trace, Visitor};
-/// Implement `Collectable` trivially for some parametric `?Sized` type.
+/// Implement `Trace` trivially for some parametric `?Sized` type.
macro_rules! param_trivial_impl_unsized {
($x: ty) => {
- unsafe impl<T: ?Sized> Collectable for $x {
+ unsafe impl<T: ?Sized> Trace for $x {
#[inline]
fn accept<V: Visitor>(&self, _: &mut V) -> Result<(), ()> {
Ok(())
@@ -56,21 +56,21 @@ param_trivial_impl_unsized!(RwLockReadGuard<'static, T>);
param_trivial_impl_unsized!(&'static T);
param_trivial_impl_unsized!(PhantomData<T>);
-unsafe impl<T: Collectable + ?Sized> Collectable for Box<T> {
+unsafe impl<T: Trace + ?Sized> Trace for Box<T> {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
(**self).accept(visitor)
}
}
-unsafe impl<H> Collectable for BuildHasherDefault<H> {
+unsafe impl<H> Trace for BuildHasherDefault<H> {
fn accept<V: Visitor>(&self, _: &mut V) -> Result<(), ()> {
Ok(())
}
}
-unsafe impl<'a, T: ToOwned> Collectable for Cow<'a, T>
+unsafe impl<'a, T: ToOwned> Trace for Cow<'a, T>
where
- T::Owned: Collectable,
+ T::Owned: Trace,
{
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
if let Cow::Owned(ref v) = self {
@@ -80,14 +80,14 @@ where
}
}
-unsafe impl<T: Collectable + ?Sized> Collectable for RefCell<T> {
+unsafe impl<T: Trace + ?Sized> Trace for RefCell<T> {
#[inline]
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.try_borrow().map_err(|_| ())?.accept(visitor)
}
}
-unsafe impl<T: Collectable + ?Sized> Collectable for Mutex<T> {
+unsafe impl<T: Trace + ?Sized> Trace for Mutex<T> {
#[inline]
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.try_lock()
@@ -100,7 +100,7 @@ unsafe impl Collectable for Mutex {
}
}
-unsafe impl<T: Collectable + ?Sized> Collectable for RwLock<T> {
+unsafe impl<T: Trace + ?Sized> Trace for RwLock<T> {
#[inline]
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.try_read()
@@ -113,7 +113,7 @@ unsafe impl Collectable for RwLock {
}
}
-unsafe impl<T: Collectable> Collectable for Option<T> {
+unsafe impl<T: Trace> Trace for Option<T> {
#[inline]
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
match self {
@@ -123,7 +123,7 @@ unsafe impl Collectable for Option {
}
}
-unsafe impl<T: Collectable, E: Collectable> Collectable for Result<T, E> {
+unsafe impl<T: Trace, E: Trace> Trace for Result<T, E> {
#[inline]
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
match self {
@@ -133,24 +133,24 @@ unsafe impl Collectable for Result {
}
}
-unsafe impl<T: Collectable + Copy> Collectable for Cell<T> {
+unsafe impl<T: Trace + Copy> Trace for Cell<T> {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.get().accept(visitor)
}
}
-unsafe impl<T: Collectable> Collectable for OnceCell<T> {
+unsafe impl<T: Trace> Trace for OnceCell<T> {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.get().map_or(Ok(()), |x| x.accept(visitor))
}
}
-/// Implement [`Collectable`] for a collection data structure which has some method `iter()` that
+/// Implement [`Trace`] for a collection data structure which has some method `iter()` that
/// iterates over all elements of the data structure and `iter_mut()` which does the same over
/// mutable references.
-macro_rules! collectable_collection_impl {
+macro_rules! trace_collection_impl {
($x: ty) => {
- unsafe impl<T: Collectable> Collectable for $x {
+ unsafe impl<T: Trace> Trace for $x {
#[inline]
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
for elem in self {
@@ -162,17 +162,15 @@ macro_rules! collectable_collection_impl {
};
}
-collectable_collection_impl!(Vec<T>);
-collectable_collection_impl!(VecDeque<T>);
-collectable_collection_impl!(LinkedList<T>);
-collectable_collection_impl!([T]);
-collectable_collection_impl!(HashSet<T>);
-collectable_collection_impl!(BinaryHeap<T>);
-collectable_collection_impl!(BTreeSet<T>);
+trace_collection_impl!(Vec<T>);
+trace_collection_impl!(VecDeque<T>);
+trace_collection_impl!(LinkedList<T>);
+trace_collection_impl!([T]);
+trace_collection_impl!(HashSet<T>);
+trace_collection_impl!(BinaryHeap<T>);
+trace_collection_impl!(BTreeSet<T>);
-unsafe impl<K: Collectable, V: Collectable, S: BuildHasher + Collectable> Collectable
- for HashMap<K, V, S>
-{
+unsafe impl<K: Trace, V: Trace, S: BuildHasher + Trace> Trace for HashMap<K, V, S> {
fn accept<Z: Visitor>(&self, visitor: &mut Z) -> Result<(), ()> {
for (k, v) in self {
k.accept(visitor)?;
@@ -182,7 +180,7 @@ unsafe impl Collec
}
}
-unsafe impl<K: Collectable, V: Collectable> Collectable for BTreeMap<K, V> {
+unsafe impl<K: Trace, V: Trace> Trace for BTreeMap<K, V> {
fn accept<Z: Visitor>(&self, visitor: &mut Z) -> Result<(), ()> {
for (k, v) in self {
k.accept(visitor)?;
@@ -192,7 +190,7 @@ unsafe impl Collectable for BTreeMap {
}
}
-unsafe impl<T: Collectable, const N: usize> Collectable for [T; N] {
+unsafe impl<T: Trace, const N: usize> Trace for [T; N] {
#[inline]
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
for elem in self {
@@ -202,11 +200,11 @@ unsafe impl Collectable for [T; N] {
}
}
-/// Implement [`Collectable`] for a trivially-collected type which contains no [`Gc`]s in its
+/// Implement [`Trace`] for a trivially-collected type which contains no [`Gc`]s in its
/// fields.
-macro_rules! collectable_trivial_impl {
+macro_rules! trace_trivial_impl {
($x: ty) => {
- unsafe impl Collectable for $x {
+ unsafe impl Trace for $x {
#[inline]
fn accept<V: Visitor>(&self, _: &mut V) -> Result<(), ()> {
Ok(())
@@ -215,70 +213,70 @@ macro_rules! collectable_trivial_impl {
};
}
-collectable_trivial_impl!(());
-
-collectable_trivial_impl!(u8);
-collectable_trivial_impl!(u16);
-collectable_trivial_impl!(u32);
-collectable_trivial_impl!(u64);
-collectable_trivial_impl!(u128);
-collectable_trivial_impl!(usize);
-collectable_trivial_impl!(i8);
-collectable_trivial_impl!(i16);
-collectable_trivial_impl!(i32);
-collectable_trivial_impl!(i64);
-collectable_trivial_impl!(i128);
-collectable_trivial_impl!(isize);
-
-collectable_trivial_impl!(bool);
-collectable_trivial_impl!(char);
-
-collectable_trivial_impl!(f32);
-collectable_trivial_impl!(f64);
-
-collectable_trivial_impl!(AtomicU8);
-collectable_trivial_impl!(AtomicU16);
-collectable_trivial_impl!(AtomicU32);
-collectable_trivial_impl!(AtomicU64);
-collectable_trivial_impl!(AtomicUsize);
-collectable_trivial_impl!(AtomicI8);
-collectable_trivial_impl!(AtomicI16);
-collectable_trivial_impl!(AtomicI32);
-collectable_trivial_impl!(AtomicI64);
-collectable_trivial_impl!(AtomicIsize);
-
-collectable_trivial_impl!(NonZeroU8);
-collectable_trivial_impl!(NonZeroU16);
-collectable_trivial_impl!(NonZeroU32);
-collectable_trivial_impl!(NonZeroU64);
-collectable_trivial_impl!(NonZeroU128);
-collectable_trivial_impl!(NonZeroUsize);
-collectable_trivial_impl!(NonZeroI8);
-collectable_trivial_impl!(NonZeroI16);
-collectable_trivial_impl!(NonZeroI32);
-collectable_trivial_impl!(NonZeroI64);
-collectable_trivial_impl!(NonZeroI128);
-collectable_trivial_impl!(NonZeroIsize);
-
-collectable_trivial_impl!(String);
-collectable_trivial_impl!(str);
-collectable_trivial_impl!(PathBuf);
-collectable_trivial_impl!(Path);
-collectable_trivial_impl!(OsString);
-collectable_trivial_impl!(OsStr);
-
-collectable_trivial_impl!(DefaultHasher);
-collectable_trivial_impl!(RandomState);
-collectable_trivial_impl!(Rc);
-collectable_trivial_impl!(SipHasher);
-
-collectable_trivial_impl!(TypeId);
-
-/// Implement [`Collectable`] for a tuple.
-macro_rules! collectable_tuple {
+trace_trivial_impl!(());
+
+trace_trivial_impl!(u8);
+trace_trivial_impl!(u16);
+trace_trivial_impl!(u32);
+trace_trivial_impl!(u64);
+trace_trivial_impl!(u128);
+trace_trivial_impl!(usize);
+trace_trivial_impl!(i8);
+trace_trivial_impl!(i16);
+trace_trivial_impl!(i32);
+trace_trivial_impl!(i64);
+trace_trivial_impl!(i128);
+trace_trivial_impl!(isize);
+
+trace_trivial_impl!(bool);
+trace_trivial_impl!(char);
+
+trace_trivial_impl!(f32);
+trace_trivial_impl!(f64);
+
+trace_trivial_impl!(AtomicU8);
+trace_trivial_impl!(AtomicU16);
+trace_trivial_impl!(AtomicU32);
+trace_trivial_impl!(AtomicU64);
+trace_trivial_impl!(AtomicUsize);
+trace_trivial_impl!(AtomicI8);
+trace_trivial_impl!(AtomicI16);
+trace_trivial_impl!(AtomicI32);
+trace_trivial_impl!(AtomicI64);
+trace_trivial_impl!(AtomicIsize);
+
+trace_trivial_impl!(NonZeroU8);
+trace_trivial_impl!(NonZeroU16);
+trace_trivial_impl!(NonZeroU32);
+trace_trivial_impl!(NonZeroU64);
+trace_trivial_impl!(NonZeroU128);
+trace_trivial_impl!(NonZeroUsize);
+trace_trivial_impl!(NonZeroI8);
+trace_trivial_impl!(NonZeroI16);
+trace_trivial_impl!(NonZeroI32);
+trace_trivial_impl!(NonZeroI64);
+trace_trivial_impl!(NonZeroI128);
+trace_trivial_impl!(NonZeroIsize);
+
+trace_trivial_impl!(String);
+trace_trivial_impl!(str);
+trace_trivial_impl!(PathBuf);
+trace_trivial_impl!(Path);
+trace_trivial_impl!(OsString);
+trace_trivial_impl!(OsStr);
+
+trace_trivial_impl!(DefaultHasher);
+trace_trivial_impl!(RandomState);
+trace_trivial_impl!(Rc);
+trace_trivial_impl!(SipHasher);
+
+trace_trivial_impl!(TypeId);
+
+/// Implement [`Trace`] for a tuple.
+macro_rules! trace_tuple {
() => {}; // This case is handled above by the trivial case
($($args:ident),*) => {
- unsafe impl<$($args: Collectable),*> Collectable for ($($args,)*) {
+ unsafe impl<$($args: Trace),*> Trace for ($($args,)*) {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
#[allow(non_snake_case)]
let &($(ref $args,)*) = self;
@@ -289,53 +287,53 @@ macro_rules! collectable_tuple {
}
}
-collectable_tuple!();
-collectable_tuple!(A);
-collectable_tuple!(A, B);
-collectable_tuple!(A, B, C);
-collectable_tuple!(A, B, C, D);
-collectable_tuple!(A, B, C, D, E);
-collectable_tuple!(A, B, C, D, E, F);
-collectable_tuple!(A, B, C, D, E, F, G);
-collectable_tuple!(A, B, C, D, E, F, G, H);
-collectable_tuple!(A, B, C, D, E, F, G, H, I);
-collectable_tuple!(A, B, C, D, E, F, G, H, I, J);
-
-/// Implement `Collectable` for one function type.
-macro_rules! collectable_fn {
+trace_tuple!();
+trace_tuple!(A);
+trace_tuple!(A, B);
+trace_tuple!(A, B, C);
+trace_tuple!(A, B, C, D);
+trace_tuple!(A, B, C, D, E);
+trace_tuple!(A, B, C, D, E, F);
+trace_tuple!(A, B, C, D, E, F, G);
+trace_tuple!(A, B, C, D, E, F, G, H);
+trace_tuple!(A, B, C, D, E, F, G, H, I);
+trace_tuple!(A, B, C, D, E, F, G, H, I, J);
+
+/// Implement `Trace` for one function type.
+macro_rules! trace_fn {
($ty:ty $(,$args:ident)*) => {
- unsafe impl<Ret $(, $args)*> Collectable for $ty {
+ unsafe impl<Ret $(, $args)*> Trace for $ty {
fn accept<V: Visitor>(&self, _: &mut V) -> Result<(), ()> { Ok(()) }
}
}
}
-/// Implement `Collectable` for all functions with a given set of args.
-macro_rules! collectable_fn_group {
+/// Implement `Trace` for all functions with a given set of args.
+macro_rules! trace_fn_group {
() => {
- collectable_fn!(extern "Rust" fn () -> Ret);
- collectable_fn!(extern "C" fn () -> Ret);
- collectable_fn!(unsafe extern "Rust" fn () -> Ret);
- collectable_fn!(unsafe extern "C" fn () -> Ret);
+ trace_fn!(extern "Rust" fn () -> Ret);
+ trace_fn!(extern "C" fn () -> Ret);
+ trace_fn!(unsafe extern "Rust" fn () -> Ret);
+ trace_fn!(unsafe extern "C" fn () -> Ret);
};
($($args:ident),*) => {
- collectable_fn!(extern "Rust" fn ($($args),*) -> Ret, $($args),*);
- collectable_fn!(extern "C" fn ($($args),*) -> Ret, $($args),*);
- collectable_fn!(extern "C" fn ($($args),*, ...) -> Ret, $($args),*);
- collectable_fn!(unsafe extern "Rust" fn ($($args),*) -> Ret, $($args),*);
- collectable_fn!(unsafe extern "C" fn ($($args),*) -> Ret, $($args),*);
- collectable_fn!(unsafe extern "C" fn ($($args),*, ...) -> Ret, $($args),*);
+ trace_fn!(extern "Rust" fn ($($args),*) -> Ret, $($args),*);
+ trace_fn!(extern "C" fn ($($args),*) -> Ret, $($args),*);
+ trace_fn!(extern "C" fn ($($args),*, ...) -> Ret, $($args),*);
+ trace_fn!(unsafe extern "Rust" fn ($($args),*) -> Ret, $($args),*);
+ trace_fn!(unsafe extern "C" fn ($($args),*) -> Ret, $($args),*);
+ trace_fn!(unsafe extern "C" fn ($($args),*, ...) -> Ret, $($args),*);
}
}
-collectable_fn_group!();
-collectable_fn_group!(A);
-collectable_fn_group!(A, B);
-collectable_fn_group!(A, B, C);
-collectable_fn_group!(A, B, C, D);
-collectable_fn_group!(A, B, C, D, E);
-collectable_fn_group!(A, B, C, D, E, F);
-collectable_fn_group!(A, B, C, D, E, F, G);
-collectable_fn_group!(A, B, C, D, E, F, G, H);
-collectable_fn_group!(A, B, C, D, E, F, G, H, I);
-collectable_fn_group!(A, B, C, D, E, F, G, H, I, J);
+trace_fn_group!();
+trace_fn_group!(A);
+trace_fn_group!(A, B);
+trace_fn_group!(A, B, C);
+trace_fn_group!(A, B, C, D);
+trace_fn_group!(A, B, C, D, E);
+trace_fn_group!(A, B, C, D, E, F);
+trace_fn_group!(A, B, C, D, E, F, G);
+trace_fn_group!(A, B, C, D, E, F, G, H);
+trace_fn_group!(A, B, C, D, E, F, G, H, I);
+trace_fn_group!(A, B, C, D, E, F, G, H, I, J);
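Every macro in this file stamps out the same delegation shape: `accept` forwards the visitor to each contained value and propagates the first failure with `?`. The pattern can be sketched outside the crate with toy stand-ins (the trait, visitor, and macro below are simplified mirrors for illustration, not `dumpster`'s real definitions):

```rust
// Toy stand-ins for dumpster's Trace/Visitor pair (illustrative only).
pub trait Visitor {
    fn visit_leaf(&mut self);
}

/// Safety: a toy here; in `dumpster`, an incorrect `accept` causes UB.
pub unsafe trait Trace {
    fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()>;
}

// A leaf impl; unlike dumpster's trivial impls, it reports itself so the
// delegation is observable in tests.
unsafe impl Trace for u8 {
    fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
        visitor.visit_leaf();
        Ok(())
    }
}

// Mirror of the collection macro: `accept` just delegates to every element.
// The `T` in the call-site type unifies with the macro's generic parameter,
// the same trick the macros above rely on.
macro_rules! trace_collection_impl {
    ($x: ty) => {
        unsafe impl<T: Trace> Trace for $x {
            fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
                for elem in self {
                    elem.accept(visitor)?;
                }
                Ok(())
            }
        }
    };
}

trace_collection_impl!(Vec<T>);

/// Count how many leaves a value delegates to.
pub struct LeafCounter(pub usize);

impl Visitor for LeafCounter {
    fn visit_leaf(&mut self) {
        self.0 += 1;
    }
}
```

Visiting `vec![1u8, 2, 3]` with a `LeafCounter` delegates to all three elements, which is exactly the traversal the collector uses to discover reachable `Gc`s.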
diff --git a/dumpster/src/lib.rs b/dumpster/src/lib.rs
index 205f92d..d260c51 100644
--- a/dumpster/src/lib.rs
+++ b/dumpster/src/lib.rs
@@ -35,7 +35,7 @@
//! garbage collection.
//! - There are no restrictions on the reference structure within a garbage-collected allocation
//! (references may point in any way you like).
-//! - It's trivial to make a custom type collectable using the provided derive macros.
+//! - It's trivial to implement `Trace` for a custom type using the provided derive macros.
//! - You can even store `?Sized` data in a garbage-collected pointer!
//!
//! # Module structure
@@ -48,7 +48,7 @@
//! it is recommended to use `unsync`.
//!
//! The project root contains common definitions across both `sync` and `unsync`.
-//! Types which implement [`Collectable`] can immediately be used in `unsync`, but in order to use
+//! Types which implement [`Trace`] can immediately be used in `unsync`, but in order to use
//! `sync`'s garbage collector, the types must also implement [`Sync`].
//!
//! # Examples
@@ -89,10 +89,10 @@
//! It's trivial to use custom data structures with the provided derive macro.
//!
//! ```
-//! use dumpster::{unsync::Gc, Collectable};
+//! use dumpster::{unsync::Gc, Trace};
//! use std::cell::RefCell;
//!
-//! #[derive(Collectable)]
+//! #[derive(Trace)]
//! struct Foo {
//! refs: RefCell<Vec<Gc<Foo>>>,
//! }
@@ -114,7 +114,7 @@
//!
//! ```toml
//! [dependencies]
-//! dumpster = "0.2.1"
+//! dumpster = "1.0.0"
//! ```
//!
//! # Optional features
@@ -122,14 +122,14 @@
//! `dumpster` has two optional features: `derive` and `coerce-unsized`.
//!
//! `derive` is enabled by default.
-//! It enables the derive macro for `Collectable`, which makes it easy for users to implement their
-//! own collectable types.
+//! It enables the derive macro for `Trace`, which makes it easy for users to implement
+//! `Trace` for their own types.
//!
//! ```
-//! use dumpster::{unsync::Gc, Collectable};
+//! use dumpster::{unsync::Gc, Trace};
//! use std::cell::RefCell;
//!
-//! #[derive(Collectable)] // no manual implementation required
+//! #[derive(Trace)] // no manual implementation required
//! struct Foo(RefCell<Option<Gc<Foo>>>);
//!
//! let my_foo = Gc::new(Foo(RefCell::new(None)));
@@ -156,7 +156,7 @@ let gc1: Gc<[u8]> = Gc::new([1, 2, 3]);
//!
//! ```toml
//! [dependencies]
-//! dumpster = { version = "0.1.2", features = ["coerce-unsized"]}
+//! dumpster = { version = "1.0.0", features = ["coerce-unsized"]}
//! ```
//!
//! # License
@@ -180,31 +180,31 @@ mod ptr;
pub mod sync;
pub mod unsync;
-/// The trait that any garbage-collectable data must implement.
+/// The trait that any garbage-collected data must implement.
///
-/// This trait should usually be implemented by using `#[derive(Collectable)]`, using the provided
+/// This trait should usually be implemented by using the provided `#[derive(Trace)]`
/// macro.
-/// Only data structures using raw pointers or other magic should manually implement `Collectable`.
+/// Only data structures using raw pointers or other magic should manually implement `Trace`.
///
/// # Safety
///
/// If the implementation of this trait is incorrect, this will result in undefined behavior,
/// typically double-frees or use-after-frees.
-/// This includes [`Collectable::accept`], even though it is a safe function, since its correctness
+/// This includes [`Trace::accept`], even though it is a safe function, since its correctness
/// is required for safety.
///
/// # Examples
///
-/// Implementing `Collectable` for a scalar type which contains no garbage-collected references
+/// Implementing `Trace` for a scalar type which contains no garbage-collected references
/// is very easy.
/// Accepting a visitor is simply a no-op.
///
/// ```
-/// use dumpster::{Collectable, Visitor};
+/// use dumpster::{Trace, Visitor};
///
/// struct Foo(u8);
///
-/// unsafe impl Collectable for Foo {
+/// unsafe impl Trace for Foo {
/// fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
/// Ok(())
/// }
@@ -215,11 +215,11 @@ pub mod unsync;
/// fields in `accept`.
///
/// ```
-/// use dumpster::{unsync::Gc, Collectable, Visitor};
+/// use dumpster::{unsync::Gc, Trace, Visitor};
///
/// struct Bar(Gc);
///
-/// unsafe impl Collectable for Bar {
+/// unsafe impl Trace for Bar {
/// fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
/// self.0.accept(visitor)
/// }
@@ -230,14 +230,14 @@ pub mod unsync;
/// delegate to both fields in a consistent order:
///
/// ```
-/// use dumpster::{unsync::Gc, Collectable, Visitor};
+/// use dumpster::{unsync::Gc, Trace, Visitor};
///
/// struct Baz {
/// a: Gc,
/// b: Gc,
/// }
///
-/// unsafe impl Collectable for Baz {
+/// unsafe impl Trace for Baz {
/// fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
/// self.a.accept(visitor)?;
/// self.b.accept(visitor)?;
@@ -245,7 +245,7 @@ pub mod unsync;
/// }
/// }
/// ```
-pub unsafe trait Collectable {
+pub unsafe trait Trace {
/// Accept a visitor to this garbage-collected value.
///
/// Implementors of this function need only delegate to all fields owned by this value which
@@ -268,9 +268,9 @@ pub unsafe trait Collectable {
/// A visitor for a garbage collected value.
///
/// This visitor allows us to hide details of the implementation of the garbage-collection procedure
-/// from implementors of [`Collectable`].
+/// from implementors of [`Trace`].
///
-/// When accepted by a `Collectable`, this visitor will be delegated down until it reaches a
+/// When accepted by a `Trace` implementor, this visitor will be delegated down until it reaches a
/// garbage-collected pointer.
/// Then, the garbage-collected pointer will call one of `visit_sync` or `visit_unsync`, depending
/// on which type of pointer it is.
@@ -283,7 +283,7 @@ pub trait Visitor {
/// visitor.
fn visit_sync<T>(&mut self, gc: &sync::Gc<T>)
where
- T: Collectable + Send + Sync + ?Sized;
+ T: Trace + Send + Sync + ?Sized;
/// Visit a thread-local garbage-collected pointer.
///
@@ -291,10 +291,10 @@ pub trait Visitor {
/// visitor.
fn visit_unsync<T>(&mut self, gc: &unsync::Gc<T>)
where
- T: Collectable + ?Sized;
+ T: Trace + ?Sized;
}
-// Re-export #[derive(Collectable)].
+// Re-export #[derive(Trace)].
//
// The reason re-exporting is not enabled by default is that disabling it would
// be annoying for crates that provide handwritten impls or data formats. They
@@ -303,22 +303,22 @@ pub trait Visitor {
extern crate dumpster_derive;
#[cfg(feature = "derive")]
-/// The derive macro for implementing `Collectable`.
+/// The derive macro for implementing `Trace`.
///
/// This enables users of `dumpster` to easily store custom types inside a `Gc`.
-/// To do so, simply annotate your type with `#[derive(Collectable)]`.
+/// To do so, simply annotate your type with `#[derive(Trace)]`.
///
/// # Examples
///
/// ```
-/// use dumpster::Collectable;
+/// use dumpster::Trace;
///
-/// #[derive(Collectable)]
+/// #[derive(Trace)]
/// struct Foo {
/// bar: Option<Gc<Foo>>,
/// }
/// ```
-pub use dumpster_derive::Collectable;
+pub use dumpster_derive::Trace;
/// Determine whether some value contains a garbage-collected pointer.
///
@@ -326,7 +326,7 @@ pub use dumpster_derive::Collectable;
/// - `Ok(true)`: The data structure contains a garbage-collected pointer.
/// - `Ok(false)`: The data structure contains no garbage-collected pointers.
/// - `Err(())`: The data structure was accessed while we checked it for garbage-collected pointers.
-fn contains_gcs<T: Collectable + ?Sized>(x: &T) -> Result<bool, ()> {
+fn contains_gcs<T: Trace + ?Sized>(x: &T) -> Result<bool, ()> {
/// A visitor structure used for determining whether some garbage-collected pointer contains a
/// `Gc` in its pointed-to value.
struct ContainsGcs(bool);
@@ -334,14 +334,14 @@ fn contains_gcs(x: &T) -> Result {
impl Visitor for ContainsGcs {
fn visit_sync<T>(&mut self, _: &sync::Gc<T>)
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
self.0 = true;
}
fn visit_unsync<T>(&mut self, _: &unsync::Gc<T>)
where
- T: Collectable + ?Sized,
+ T: Trace + ?Sized,
{
self.0 = true;
}
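`contains_gcs` above is a tri-state probe: `Ok(true)` when a `Gc` edge is seen, `Ok(false)` when none is, and `Err(())` when the structure is being accessed mid-check (the `RefCell`/`Mutex` impls bail out via `try_borrow`/`try_lock`). The contract can be sketched without the real types (illustrative stand-in, not `dumpster`'s API):

```rust
use std::cell::RefCell;

// Toy mirror of `contains_gcs`'s contract. Each boolean payload stands in
// for "this field is a garbage-collected pointer".
fn probe(cells: &[RefCell<bool>]) -> Result<bool, ()> {
    let mut found = false;
    for cell in cells {
        // A concurrently-borrowed cell means the structure was accessed
        // while we were checking it, so the probe bails out with Err(()).
        found |= *cell.try_borrow().map_err(|_| ())?;
    }
    Ok(found)
}
```

Returning `Err` rather than blocking keeps the check non-intrusive: a busy allocation is simply skipped and revisited on a later collection.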
diff --git a/dumpster/src/sync/collect.rs b/dumpster/src/sync/collect.rs
index 523bf22..650d417 100644
--- a/dumpster/src/sync/collect.rs
+++ b/dumpster/src/sync/collect.rs
@@ -22,7 +22,7 @@ use std::{
use parking_lot::{Mutex, RwLock};
-use crate::{ptr::Erased, Collectable, Visitor};
+use crate::{ptr::Erased, Trace, Visitor};
use super::{default_collect_condition, CollectCondition, CollectInfo, Gc, GcBox, CURRENT_TAG};
@@ -168,7 +168,7 @@ pub fn notify_created_gc() {
/// be cleaned up.
pub(super) fn mark_dirty<T>(allocation: NonNull<GcBox<T>>)
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
let box_ref = unsafe { allocation.as_ref() };
DUMPSTER.with(|dumpster| {
@@ -193,7 +193,7 @@ where
/// need to be cleaned again.
pub(super) fn mark_clean<T>(allocation: &GcBox<T>)
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
DUMPSTER.with(|dumpster| {
if dumpster
@@ -344,7 +344,7 @@ impl GarbageTruck {
/// # Safety
///
/// `ptr` must have been created as a pointer to a `GcBox`.
-unsafe fn dfs<T: Collectable + Send + Sync + ?Sized>(
+unsafe fn dfs<T: Trace + Send + Sync + ?Sized>(
ptr: Erased,
ref_graph: &mut HashMap,
) {
@@ -396,7 +396,7 @@ struct Dfs<'a> {
impl<'a> Visitor for Dfs<'a> {
fn visit_sync<T>(&mut self, gc: &Gc<T>)
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
let ptr = unsafe { (*gc.ptr.get()).unwrap() };
let box_ref = unsafe { ptr.as_ref() };
@@ -468,7 +468,7 @@ impl<'a> Visitor for Dfs<'a> {
fn visit_unsync<T>(&mut self, _: &crate::unsync::Gc<T>)
where
- T: Collectable + ?Sized,
+ T: Trace + ?Sized,
{
unreachable!("sync Gc cannot own an unsync Gc");
}
@@ -492,7 +492,7 @@ fn mark(root: AllocationId, graph: &mut HashMap) {
/// # Safety
///
/// `ptr` must have been created from a pointer to a `GcBox`.
-unsafe fn destroy_erased<T: Collectable + Send + Sync + ?Sized>(
+unsafe fn destroy_erased<T: Trace + Send + Sync + ?Sized>(
ptr: Erased,
graph: &HashMap,
) {
@@ -506,7 +506,7 @@ unsafe fn destroy_erased(
impl Visitor for PrepareForDestruction<'_> {
fn visit_sync<T>(&mut self, gc: &crate::sync::Gc<T>)
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
let id = AllocationId::from(unsafe { (*gc.ptr.get()).unwrap() });
if matches!(self.graph[&id].reachability, Reachability::Reachable) {
@@ -522,7 +522,7 @@ unsafe fn destroy_erased(
fn visit_unsync<T>(&mut self, _: &crate::unsync::Gc<T>)
where
- T: Collectable + ?Sized,
+ T: Trace + ?Sized,
{
unreachable!("no unsync members of sync Gc possible!");
}
@@ -544,7 +544,7 @@ unsafe fn destroy_erased(
/// # Safety
///
/// `ptr` must have been created as a pointer to a `GcBox`.
-unsafe fn drop_weak_zero<T: Collectable + Send + Sync + ?Sized>(ptr: Erased) {
+unsafe fn drop_weak_zero<T: Trace + Send + Sync + ?Sized>(ptr: Erased) {
let mut specified = ptr.specify::<GcBox<T>>();
assert_eq!(specified.as_ref().weak.load(Ordering::Relaxed), 0);
assert_eq!(specified.as_ref().strong.load(Ordering::Relaxed), 0);
@@ -559,7 +559,7 @@ unsafe impl Sync for AllocationId {}
impl<T> From<&GcBox<T>> for AllocationId
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
fn from(value: &GcBox<T>) -> Self {
AllocationId(NonNull::from(value).cast())
@@ -568,7 +568,7 @@ where
impl<T> From<NonNull<GcBox<T>>> for AllocationId
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
fn from(value: NonNull<GcBox<T>>) -> Self {
AllocationId(value.cast())
diff --git a/dumpster/src/sync/mod.rs b/dumpster/src/sync/mod.rs
index 30c61dd..dc6c04f 100644
--- a/dumpster/src/sync/mod.rs
+++ b/dumpster/src/sync/mod.rs
@@ -41,7 +41,7 @@ use std::{
sync::atomic::{fence, AtomicUsize, Ordering},
};
-use crate::{contains_gcs, ptr::Nullable, Collectable, Visitor};
+use crate::{contains_gcs, ptr::Nullable, Trace, Visitor};
use self::collect::{
collect_all_await, currently_cleaning, mark_clean, mark_dirty, n_gcs_dropped, n_gcs_existing,
@@ -79,11 +79,11 @@ use self::collect::{
/// object.
/// To prevent undefined behavior, these `Gc`s are marked as dead during collection and rendered
/// inaccessible.
-/// Dereferencing or cloning a `Gc` during the `Drop` implementation of a `Collectable` type could
+/// Dereferencing or cloning a `Gc` during the `Drop` implementation of a `Trace` type could
/// result in the program panicking to keep the program from accessing memory after freeing it.
/// If you're accessing a `Gc` during a `Drop` implementation, make sure to use the fallible
/// operations [`Gc::try_deref`] and [`Gc::try_clone`].
-pub struct Gc<T: Collectable + Send + Sync + ?Sized> {
+pub struct Gc<T: Trace + Send + Sync + ?Sized> {
/// The pointer to the allocation.
ptr: UnsafeCell<Nullable<GcBox<T>>>,
/// The tag information of this pointer, used for mutation detection when marking.
@@ -98,7 +98,7 @@ static CURRENT_TAG: AtomicUsize = AtomicUsize::new(0);
/// The backing allocation for a [`Gc`].
struct GcBox<T>
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
/// The "strong" count, which is the number of extant `Gc`s to this allocation.
/// If the strong count is zero, a value contained in the allocation may be dropped, but the
@@ -116,8 +116,8 @@ where
value: T,
}
-unsafe impl<T> Send for Gc<T> where T: Collectable + Send + Sync + ?Sized {}
-unsafe impl<T> Sync for Gc<T> where T: Collectable + Send + Sync + ?Sized {}
+unsafe impl<T> Send for Gc<T> where T: Trace + Send + Sync + ?Sized {}
+unsafe impl<T> Sync for Gc<T> where T: Trace + Send + Sync + ?Sized {}
/// Begin a collection operation of the allocations on the heap.
///
@@ -213,7 +213,7 @@ pub use collect::set_collect_condition;
impl<T> Gc<T>
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
/// Construct a new garbage-collected value.
///
@@ -245,7 +245,7 @@ where
/// This function will return `None` if `self` is a "dead" `Gc`, which points to an
/// already-deallocated object.
/// This can only occur if a `Gc` is accessed during the `Drop` implementation of a
- /// [`Collectable`] object.
+ /// [`Trace`] object.
///
/// For a version which panics instead of returning `None`, consider using [`Deref`].
///
@@ -264,10 +264,10 @@ where
/// `Drop` implementation.
///
/// ```
- /// use dumpster::{sync::Gc, Collectable};
+ /// use dumpster::{sync::Gc, Trace};
/// use std::sync::Mutex;
///
- /// #[derive(Collectable)]
+ /// #[derive(Trace)]
/// struct Cycle(Mutex<Option<Gc<Cycle>>>);
///
/// impl Drop for Cycle {
@@ -295,7 +295,7 @@ where
/// This function will return `None` if `self` is a "dead" `Gc`, which points to an
/// already-deallocated object.
/// This can only occur if a `Gc` is accessed during the `Drop` implementation of a
- /// [`Collectable`] object.
+ /// [`Trace`] object.
///
/// For a version which panics instead of returning `None`, consider using [`Clone`].
///
@@ -314,10 +314,10 @@ where
/// `Drop` implementation.
///
/// ```
- /// use dumpster::{sync::Gc, Collectable};
+ /// use dumpster::{sync::Gc, Trace};
/// use std::sync::Mutex;
///
- /// #[derive(Collectable)]
+ /// #[derive(Trace)]
/// struct Cycle(Mutex<Option<Gc<Cycle>>>);
///
/// impl Drop for Cycle {
@@ -341,7 +341,7 @@ where
/// Panics if `self` is a "dead" `Gc`,
/// which points to an already-deallocated object.
/// This can only occur if a `Gc` is accessed during the `Drop` implementation of a
- /// [`Collectable`] object.
+ /// [`Trace`] object.
///
/// # Examples
///
@@ -383,7 +383,7 @@ where
impl<T> Clone for Gc<T>
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
/// Clone a garbage-collected reference.
/// This does not clone the underlying data.
@@ -392,7 +392,7 @@ where
///
/// This function will panic if the `Gc` being cloned points to a deallocated object.
/// This is only possible if said `Gc` is accessed during the `Drop` implementation of a
- /// `Collectable` value.
+ /// `Trace` value.
///
/// For a fallible version, refer to [`Gc::try_clone`].
///
@@ -412,10 +412,10 @@ where
/// The following example will fail, because cloning a `Gc` to a deallocated object is wrong.
///
/// ```should_panic
- /// use dumpster::{sync::Gc, Collectable};
+ /// use dumpster::{sync::Gc, Trace};
/// use std::sync::Mutex;
///
- /// #[derive(Collectable)]
+ /// #[derive(Trace)]
/// struct Cycle(Mutex<Option<Gc<Cycle>>>);
///
/// impl Drop for Cycle {
@@ -450,7 +450,7 @@ where
impl<T> Drop for Gc<T>
where
- T: Collectable + Send + Sync + ?Sized,
+ T: Trace + Send + Sync + ?Sized,
{
fn drop(&mut self) {
if currently_cleaning() {
@@ -531,14 +531,14 @@ impl CollectInfo {
}
}
-unsafe impl<T: Collectable + Send + Sync + ?Sized> Collectable for Gc<T> {
+unsafe impl<T: Trace + Send + Sync + ?Sized> Trace for Gc<T> {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
visitor.visit_sync(self);
Ok(())
}
}
-impl<T: Collectable + Send + Sync + ?Sized> Deref for Gc<T> {
+impl<T: Trace + Send + Sync + ?Sized> Deref for Gc<T> {
type Target = T;
/// Dereference this pointer, creating a reference to the contained value `T`.
@@ -564,10 +564,10 @@ impl<T: Collectable + Send + Sync + ?Sized> Deref for Gc<T> {
///
/// ```should_panic
/// // This is wrong!
- /// use dumpster::{sync::Gc, Collectable};
+ /// use dumpster::{sync::Gc, Trace};
/// use std::sync::Mutex;
///
- /// #[derive(Collectable)]
+ /// #[derive(Trace)]
/// struct Bad {
/// s: String,
/// cycle: Mutex<Option<Gc<Bad>>>,
@@ -600,7 +600,7 @@ impl<T: Collectable + Send + Sync + ?Sized> Deref for Gc<T> {
impl<T> PartialEq<Gc<T>> for Gc<T>
where
- T: Collectable + Send + Sync + ?Sized + PartialEq,
+ T: Trace + Send + Sync + ?Sized + PartialEq,
{
/// Test for equality on two `Gc`s.
///
@@ -631,21 +631,21 @@ where
}
}
-impl<T> Eq for Gc<T> where T: Collectable + Send + Sync + ?Sized + PartialEq {}
+impl<T> Eq for Gc<T> where T: Trace + Send + Sync + ?Sized + PartialEq {}
-impl<T: Collectable + Send + Sync + ?Sized> AsRef<T> for Gc<T> {
+impl<T: Trace + Send + Sync + ?Sized> AsRef<T> for Gc<T> {
fn as_ref(&self) -> &T {
self
}
}
-impl<T: Collectable + Send + Sync + ?Sized> Borrow<T> for Gc<T> {
+impl<T: Trace + Send + Sync + ?Sized> Borrow<T> for Gc<T> {
fn borrow(&self) -> &T {
self
}
}
-impl<T: Collectable + Send + Sync + ?Sized> std::fmt::Pointer for Gc<T> {
+impl<T: Trace + Send + Sync + ?Sized> std::fmt::Pointer for Gc<T> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
std::fmt::Pointer::fmt(&addr_of!(**self), f)
}
@@ -654,12 +654,12 @@ impl<T: Collectable + Send + Sync + ?Sized> std::fmt::Pointer for Gc<T> {
#[cfg(feature = "coerce-unsized")]
impl<T, U> std::ops::CoerceUnsized<Gc<U>> for Gc<T>
where
- T: std::marker::Unsize<U> + Collectable + Send + Sync + ?Sized,
- U: Collectable + Send + Sync + ?Sized,
+ T: std::marker::Unsize<U> + Trace + Send + Sync + ?Sized,
+ U: Trace + Send + Sync + ?Sized,
{
}
-impl<T: Collectable + Send + Sync + ?Sized> Debug for Gc<T> {
+impl<T: Trace + Send + Sync + ?Sized> Debug for Gc<T> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(
f,
diff --git a/dumpster/src/sync/tests.rs b/dumpster/src/sync/tests.rs
index bc087cf..7b0255f 100644
--- a/dumpster/src/sync/tests.rs
+++ b/dumpster/src/sync/tests.rs
@@ -28,7 +28,7 @@ impl<'a> Drop for DropCount<'a> {
}
}
-unsafe impl Collectable for DropCount<'_> {
+unsafe impl Trace for DropCount<'_> {
fn accept<V: Visitor>(&self, _: &mut V) -> Result<(), ()> {
Ok(())
}
@@ -40,7 +40,7 @@ struct MultiRef {
count: DropCount<'static>,
}
-unsafe impl Collectable for MultiRef {
+unsafe impl Trace for MultiRef {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.refs.accept(visitor)
}
@@ -76,7 +76,7 @@ fn self_referential() {
struct Foo(Mutex<Option<Gc<Foo>>>);
static DROP_COUNT: AtomicUsize = AtomicUsize::new(0);
- unsafe impl Collectable for Foo {
+ unsafe impl Trace for Foo {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.0.accept(visitor)
}
@@ -304,14 +304,14 @@ fn malicious() {
unsafe impl Send for X {}
- unsafe impl Collectable for A {
+ unsafe impl Trace for A {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.x.accept(visitor)?;
self.y.accept(visitor)
}
}
- unsafe impl Collectable for X {
+ unsafe impl Trace for X {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.a.accept(visitor)?;
@@ -326,7 +326,7 @@ fn malicious() {
}
}
- unsafe impl Collectable for Y {
+ unsafe impl Trace for Y {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.a.accept(visitor)
}
@@ -390,7 +390,7 @@ fn fuzz() {
}
}
- unsafe impl Collectable for Alloc {
+ unsafe impl Trace for Alloc {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.refs.accept(visitor)
}
@@ -500,13 +500,13 @@ fn root_canal() {
a3: Mutex<Option<Gc<A>>>,
}
- unsafe impl Collectable for A {
+ unsafe impl Trace for A {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.b.accept(visitor)
}
}
- unsafe impl Collectable for B {
+ unsafe impl Trace for B {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
let n_prior_visits = B_VISIT_COUNT.fetch_add(1, Ordering::Relaxed);
self.a0.accept(visitor)?;
@@ -622,7 +622,7 @@ fn escape_dead_pointer() {
}
}
- unsafe impl Collectable for Escape {
+ unsafe impl Trace for Escape {
fn accept<V: Visitor>(&self, visitor: &mut V) -> Result<(), ()> {
self.ptr.accept(visitor)
}
diff --git a/dumpster/src/unsync/collect.rs b/dumpster/src/unsync/collect.rs
index 4a969ff..d3a12ac 100644
--- a/dumpster/src/unsync/collect.rs
+++ b/dumpster/src/unsync/collect.rs
@@ -19,7 +19,7 @@ use std::{
use crate::{
ptr::Erased,
unsync::{default_collect_condition, CollectInfo, Gc},
- Collectable, Visitor,
+ Trace, Visitor,
};
use super::{CollectCondition, GcBox};
@@ -58,7 +58,7 @@ struct AllocationId(pub NonNull<Cell<usize>>);
impl<T> From<NonNull<GcBox<T>>> for AllocationId
where
- T: Collectable + ?Sized,
+ T: Trace + ?Sized,
{
/// Get an allocation ID from a pointer to an allocation.
fn from(value: NonNull<GcBox<T>>) -> Self {
@@ -83,7 +83,7 @@ struct Cleanup {
impl Cleanup {
/// Construct a new cleanup for an allocation.
- fn new<T: Collectable + ?Sized>(box_ptr: NonNull<GcBox<T>>) -> Cleanup {
+ fn new<T: Trace + ?Sized>(box_ptr: NonNull<GcBox<T>>) -> Cleanup {
Cleanup {
dfs_fn: apply_visitor::<T, Dfs>,
mark_fn: apply_visitor::<T, Mark>,
@@ -98,7 +98,7 @@ impl Cleanup {
/// # Safety
///
/// `T` must be the same type that `ptr` was created with via [`ErasedPtr::new`].
-unsafe fn apply_visitor<T: Collectable + ?Sized, V: Visitor>(ptr: Erased, visitor: &mut V) {
+unsafe fn apply_visitor<T: Trace + ?Sized, V: Visitor>(ptr: Erased, visitor: &mut V) {
let specified: NonNull<GcBox<T>> = ptr.specify();
let _ = specified.as_ref().value.accept(visitor);
}
@@ -164,7 +164,7 @@ impl Dumpster {
/// Mark an allocation as "dirty," implying that it may need to be swept through later to find
/// out if it has any references pointing to it.
- pub fn mark_dirty<T: Collectable + ?Sized>(&self, box_ptr: NonNull<GcBox<T>>) {
+ pub fn mark_dirty<T: Trace + ?Sized>(&self, box_ptr: NonNull<GcBox<T>>) {
self.to_collect
.borrow_mut()
.entry(AllocationId::from(box_ptr))
@@ -173,7 +173,7 @@ impl Dumpster {
/// Mark an allocation as "cleaned," implying that the allocation is about to be destroyed and
/// therefore should not be cleaned up later.
- pub fn mark_cleaned(&self, box_ptr: NonNull>) {
+ pub fn mark_cleaned |