Hamsters.js makes use of Worker Threads to accomplish its multi-threading functionality. Due to the sandboxed nature of Worker Threads, there are some limitations on how you can make use of them. Hamsters.js continually works to minimize the limitations imposed on the library's functionality; however, some things cannot be bypassed.
You cannot make use of localStorage or sessionStorage from within a thread, as this would expose the main thread to changes made within another thread. For the same reason, you cannot access the DOM (Document Object Model) from within a thread: changes to the DOM can only be made from the main thread.
When transferable objects are not used, data passed between the main page and workers is copied, not shared. Objects are serialized as they are handed to the worker and subsequently deserialized on the other end. The main thread and worker thread do not share the same instance, so the end result is that a duplicate is created on each side. Most environments implement this behavior as structured cloning.
The above means there are some limitations on what data can and cannot be passed to a thread. The following limitations, sourced from the structured clone algorithm, are known; some have workarounds at the cost of slower performance.
- Error and Function objects cannot be duplicated by the structured clone algorithm; attempting to do so throws a DataCloneError exception.
- Attempting to clone DOM nodes will likewise throw a DataCloneError exception.
- Certain properties of objects are not preserved:
  - The lastIndex field of RegExp objects is not preserved.
  - Property descriptors, setters, and getters (as well as similar metadata-like features) are not duplicated. For example, if an object is marked read-only using a property descriptor, it will be read-write in the duplicate, since that is the default condition.
  - The prototype chain does not get walked and duplicated.
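Each of the limitations above can be reproduced with the `structuredClone()` global (available in Node 17+ and modern browsers), which applies the same algorithm as `worker.postMessage`. A brief sketch:

```javascript
// 1. Functions cannot be cloned — a DataCloneError is thrown.
let errorName;
try {
  structuredClone(function () {});
} catch (e) {
  errorName = e.name;
}
console.log(errorName); // "DataCloneError"

// 2. The lastIndex field of a RegExp is not preserved.
const re = /a/g;
re.exec('aaa');
console.log(re.lastIndex);                  // 1 after the match
console.log(structuredClone(re).lastIndex); // 0 on the clone

// 3. Property descriptors are not preserved — a read-only property
//    becomes writable on the copy.
const locked = Object.defineProperty({}, 'x', {
  value: 1,
  writable: false,
  enumerable: true
});
const unlocked = structuredClone(locked);
unlocked.x = 2; // succeeds on the clone, would be ignored on the original
console.log(unlocked.x); // 2

// 4. The prototype chain is not walked — class instances come back
//    as plain objects that keep their own data properties only.
class Point { constructor(x) { this.x = x; } }
const copy = structuredClone(new Point(5));
console.log(copy instanceof Point); // false
console.log(copy.x);                // 5
```

In practice this means data handed to a thread should be plain serializable values (numbers, strings, arrays, plain objects, typed arrays); anything carrying behavior or metadata must be reconstructed on the receiving side.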