Add inelastic sample component that changes neutron wavelengths#124

Merged

nvaytet merged 56 commits into main from inelastic-sample on Mar 11, 2026

Conversation

@nvaytet (Member) commented Feb 25, 2026

This PR adds a new type of component: InelasticSample.
It modifies the neutron energies according to a user-defined energy-transfer function.

It is used as follows:

import numpy as np
import scipp as sc
import tof

rng = np.random.default_rng()
meter = sc.Unit('m')

# Sample 1: double-peak at min and max
def double_peak(e_i):
    # Shift each incoming energy by either -0.2 or +0.2 meV
    de = sc.array(dims=e_i.dims, values=rng.choice([-0.2, 0.2], size=e_i.shape), unit='meV')
    return e_i.to(unit='meV', copy=False) - de

sample1 = tof.InelasticSample(
    distance=28.0 * meter,
    name="sample",
    func=double_peak,
)

# Sample 2: normally-distributed energy transfer
def normal_deltae(e_i):
    de = sc.array(dims=e_i.dims, values=rng.normal(scale=0.05, size=e_i.shape), unit='meV')
    return e_i.to(unit='meV', copy=False) - de

sample2 = tof.InelasticSample(
    distance=28.0 * meter,
    name="sample",
    func=normal_deltae,
)

# source, choppers and detectors are defined as elsewhere in the docs
model = tof.Model(source=source, components=choppers + detectors + [sample1])

Internal changes

Quite a bit of the internals had to be refactored, because of the way the chopper and detector logic was hard-coded inside the Model (especially the run function).
Instead of taking in choppers and detectors, the Model now accepts a flat list of components.
The effects of each component are applied onto the neutrons in succession, and the logic for those effects has been moved inside the components themselves.

I think it makes for a much cleaner structure in the model. Logic for plotting was also moved into the respective components.

Fixes #117

@nvaytet (Member Author) commented Feb 25, 2026

@bingli621 please take a look (and also try it out!)

self.add_detector(None)
det = self.detectors_container.children[-1]
det.distance_widget.value = comp.distance.to(unit='m').value
det.name_widget.value = comp.name
@nvaytet (Member Author):

This is where we also need to update the dashboard to support samples. This will be done in a follow-up PR.

@nvaytet (Member Author) commented Feb 26, 2026

Comments from in-person discussion:

  • Use a callable instead of a data array with probability distribution to apply the sample effects on the neutrons (more flexible)
  • Prevent neutrons from becoming unphysical (a large DeltaE at long wavelengths could make the final energy drop below zero)
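The second bullet can be sketched with plain numpy. This is only an illustration of the guard being discussed, not tof's actual implementation; `apply_energy_transfer` is a hypothetical name:

```python
import numpy as np

def apply_energy_transfer(e_i, delta_e):
    """Apply an energy transfer and flag unphysical results.

    e_i and delta_e are plain numpy arrays in meV. Neutrons whose final
    energy would be zero or negative are flagged for removal.
    """
    e_f = e_i - delta_e
    valid = e_f > 0.0  # drop neutrons that would end up with E_f <= 0
    return e_f, valid

e_i = np.array([1.0, 0.15, 5.0])
e_f, valid = apply_energy_transfer(e_i, np.array([0.2, 0.2, 0.2]))
# Only the first and last neutrons survive; 0.15 - 0.2 < 0 is masked out.
```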

Copilot AI left a comment:

Pull request overview

This PR adds a new InelasticSample component to the tof library (fixing #117), which allows simulating inelastic scattering by modifying neutron energies based on a user-provided energy-transfer function. Alongside this new component, the PR significantly refactors the internal architecture of Model, Result, and related classes: instead of tracking choppers and detectors separately, all components are now managed through a unified components collection, with backwards compatibility for old-style choppers/detectors constructor arguments.

Changes:

  • New InelasticSample / InelasticSampleReading classes in inelastic.py, with energy-transfer logic and new utility functions (wavelength_to_energy, energy_to_wavelength) in utils.py
  • Refactored Model, Result, and component classes so that choppers, detectors, and samples all implement a common Component.apply() interface; the old separate choppers/detectors API is kept for backwards compatibility
  • Updated docs notebook, API reference, __init__.py exports, and tests to reflect the new component system

Reviewed changes

Copilot reviewed 15 out of 15 changed files in this pull request and generated 11 comments.

Summary per file:

  • src/tof/inelastic.py: New InelasticSample and InelasticSampleReading classes
  • src/tof/component.py: New Component base class; ReadingField gets nanmin/nanmax/nanmean/mean methods
  • src/tof/chopper.py: Chopper now extends Component; ChopperReading moved here; apply() method added
  • src/tof/detector.py: Detector now extends Component; DetectorReading moved here; apply() added
  • src/tof/model.py: Model refactored to use unified _components dict; run() uses comp.apply() dispatch
  • src/tof/result.py: Result unified to use _components; _get_rays refactored for multi-segment rendering
  • src/tof/source.py: SourceParameters renamed to SourceReading; Source.from_json() added; as_readonly() updated
  • src/tof/utils.py: New utility functions: wavelength_to_energy, energy_to_wavelength, var_from_dict, extract_component_group
  • src/tof/__init__.py: InelasticSample/InelasticSampleReading exported; SourceParameters replaced by SourceReading
  • src/tof/dashboard.py: Updated populate_from_instrument to iterate over flat components list
  • tests/inelastic_test.py: New test file for InelasticSample behavior
  • tests/model_test.py: Tests updated to parametrize legacy vs. new components API; test_neutron_time_of_flight removed
  • tests/result_test.py: Two test lines updated from model.choppers to model.components
  • docs/components.ipynb: New section documenting the InelasticSample component
  • docs/api-reference/index.md: Added Component, InelasticSample; replaced SourceParameters with SourceReading
Comments suppressed due to low confidence (1)

src/tof/component.py:161

  • The abstract method Component.apply() declares a time_limit: sc.Variable parameter in its signature, but none of the concrete implementations (Chopper.apply, Detector.apply, InelasticSample.apply) accept this parameter. The call site in model.py also calls comp.apply(neutrons=neutrons) without passing time_limit. The abstract method signature should match the actual interface used by all implementations to avoid confusion.


Comment on lines +91 to +92
def plot(self, **kwargs) -> pp.FigureLike:
return pp.xyplot(self.energies, self.probabilities, **kwargs)
Copilot AI commented Mar 10, 2026:

The InelasticSample.plot() method at line 92 references self.energies and self.probabilities, but neither attribute is defined anywhere in InelasticSample. Calling plot() on an InelasticSample instance will raise an AttributeError.

@nvaytet (Member Author):

Thanks, I forgot to remove the plot method which was left over from when the delta_e was implemented as a probability distribution rather than a function.

Comment on lines 43 to +45
raise ValueError(
f"Chopper direction must be 'clockwise' or 'anti-clockwise', got "
f"'{comp['direction']}' for component {name}."
)
choppers.append(
Chopper(
frequency=comp["frequency"]["value"]
* sc.Unit(comp["frequency"]["unit"]),
direction=_dir,
open=_array_or_none(comp, "open"),
close=_array_or_none(comp, "close"),
centers=_array_or_none(comp, "centers"),
widths=_array_or_none(comp, "widths"),
phase=comp["phase"]["value"] * sc.Unit(comp["phase"]["unit"]),
distance=comp["distance"]["value"]
* sc.Unit(comp["distance"]["unit"]),
name=name,
"Only one source is allowed, but multiple were found in the"
"instrument parameters."
Copilot AI commented Mar 10, 2026:

The error message string on lines 43-45 is missing a space between "the" and "instrument" because of the implicit string concatenation of two adjacent string literals. The result would be "Only one source is allowed, but multiple were found in theinstrument parameters." instead of the intended "Only one source is allowed, but multiple were found in the instrument parameters.".
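The pitfall can be verified in isolation. This snippet only illustrates Python's implicit string-literal concatenation, not the actual tof code:

```python
# Adjacent string literals are concatenated with no separator inserted:
msg = (
    "Only one source is allowed, but multiple were found in the"
    "instrument parameters."
)
# The missing trailing space glues "the" and "instrument" together.

fixed = (
    "Only one source is allowed, but multiple were found in the "
    "instrument parameters."
)
```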

src/tof/model.py Outdated
Comment on lines +234 to +266
neutrons = self.source.data.copy(deep=False)
neutrons.masks["blocked_by_others"] = sc.zeros(
sizes=neutrons.sizes, unit=None, dtype=bool
)
neutrons.coords.update(
distance=self.source.distance, toa=neutrons.coords['birth_time']
)

result_choppers = {}
result_detectors = {}
time_limit = (
birth_time
+ ((components[-1].distance - self.source.distance) / speed).to(
unit=birth_time.unit
)
).max()
for c in components:
container = result_detectors if isinstance(c, Detector) else result_choppers
container[c.name] = c.as_dict()
container[c.name]['data'] = self.source.data.copy(deep=False)
tof = ((c.distance - self.source.distance) / speed).to(
unit=birth_time.unit, copy=False
time_unit = neutrons.coords['birth_time'].unit

readings = {}
for comp in components:
neutrons = neutrons.copy(deep=False)
toa = neutrons.coords['toa'] + (
(comp.distance - neutrons.coords['distance']) / neutrons.coords['speed']
).to(unit=time_unit, copy=False)
neutrons.coords['toa'] = toa
neutrons.coords['eto'] = toa % (1 / self.source.frequency).to(
unit=time_unit, copy=False
)
t = birth_time + tof
container[c.name]['data'].coords['toa'] = t
container[c.name]['data'].coords['eto'] = t % (
1 / self.source.frequency
).to(unit=t.unit, copy=False)
container[c.name]['data'].coords['distance'] = c.distance
container[c.name]['data'].coords['tof'] = tof
if isinstance(c, Detector):
container[c.name]['data'].masks['blocked_by_others'] = ~initial_mask
continue
m = sc.zeros(sizes=t.sizes, unit=None, dtype=bool)
to, tc = c.open_close_times(time_limit=time_limit)
container[c.name].update({'open_times': to, 'close_times': tc})
for i in range(len(to)):
m |= (t > to[i]) & (t < tc[i])
combined = initial_mask & m
container[c.name]['data'].masks['blocked_by_others'] = ~initial_mask
container[c.name]['data'].masks['blocked_by_me'] = ~m & initial_mask
initial_mask = combined

return Result(
source=self.source, choppers=result_choppers, detectors=result_detectors
)
neutrons.coords['distance'] = comp.distance

if "blocked_by_me" in neutrons.masks:
# Because we use shallow copies, we do not want to do an in-place |=
# operation here
neutrons.masks['blocked_by_others'] = neutrons.masks[
'blocked_by_others'
] | neutrons.masks.pop('blocked_by_me')

neutrons, reading = comp.apply(neutrons=neutrons)
readings[comp.name] = reading

return Result(source=self.source.as_readonly(), readings=readings)
Copilot AI commented Mar 10, 2026:

The tof coordinate is no longer computed during simulation. Previously the code computed tof = ((c.distance - source.distance) / speed) and stored it as a coordinate. The public ComponentReading.tof property (defined in component.py) relies on this coordinate being present in the data. Removing the coordinate without removing or updating the property means any user calling .tof on a reading will get a KeyError. Either reintroduce the tof coordinate (it can be computed as toa - birth_time) or remove the ComponentReading.tof property.

@nvaytet (Member Author):

👍 I'll remove the .tof property on the ComponentReading.

Comment on lines +21 to +23
def _get_rays(
components: list[ComponentReading], pulse: int, inds: np.ndarray
) -> tuple[np.ndarray, np.ndarray]:
Copilot AI commented Mar 10, 2026:

The return type annotation of _get_rays declares tuple[np.ndarray, np.ndarray] but the function actually returns three arrays (x, y, c). The correct annotation should be tuple[np.ndarray, np.ndarray, np.ndarray].

Comment on lines +280 to +283
for group, comps in groups.items():
out += f" {group.capitalize()}s:\n"
for comp in sorted(comps, key=lambda c: c.distance):
out += f" {comp.name}: {comp._repr_stats()}\n"
Copilot AI commented Mar 10, 2026:

The Result.__repr__ formats component group headers using group.capitalize() followed by "s". Since InelasticSampleReading.kind returns "inelastic_sample", "inelastic_sample".capitalize() produces "Inelastic_sample", resulting in the header "Inelastic_samples:" with an underscore in the middle. This looks unintentional - a more user-friendly representation like "Inelastic Samples:" should be used instead.
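A quick illustration of the difference (plain Python, not the actual Result.__repr__ code):

```python
kind = "inelastic_sample"

# str.capitalize() only uppercases the first character of the whole string,
# leaving the underscore and the second word untouched:
ugly_header = kind.capitalize() + "s"  # "Inelastic_samples"

# Replacing underscores before title-casing gives a friendlier header:
nice_header = kind.replace("_", " ").title() + "s"  # "Inelastic Samples"
```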

)


def test_inelastic_sample_negative_final_energyies_are_dropped():
Copilot AI commented Mar 10, 2026:

There's a spelling error in the test function name: test_inelastic_sample_negative_final_energyies_are_dropped — "energyies" should be "energies".

Comment on lines +77 to +85
def __init__(
self,
distance: sc.Variable,
name: str,
func: Callable[[sc.Variable], sc.Variable],
):
self.distance = distance.to(dtype=float, copy=False)
self.name = name
self.func = func
Copilot AI commented Mar 10, 2026:

The PR description shows InelasticSample being constructed with a delta_e parameter (e.g. delta_e=double_peak), but the actual implementation uses func as the parameter name. This discrepancy in the PR description could be confusing to readers, though the code itself is consistent with using func.

src/tof/model.py Outdated
Comment on lines +22 to +56
@@ -42,85 +35,85 @@ def make_beamline(instrument: dict) -> dict[str, list[Chopper] | list[Detector]]
type, see the documentation of the :class:`Chopper` and :class:`Detector`
classes for details.
"""
choppers = []
detectors = []
beamline = {"components": []}
mapping = {"chopper": Chopper, "detector": Detector}
for name, comp in instrument.items():
if comp["type"] == "chopper":
direction = comp["direction"].lower()
if direction == "clockwise":
_dir = Clockwise
elif any(x in direction for x in ("anti", "counter")):
_dir = AntiClockwise
else:
if comp["type"] == "source":
if "source" in beamline:
raise ValueError(
f"Chopper direction must be 'clockwise' or 'anti-clockwise', got "
f"'{comp['direction']}' for component {name}."
)
choppers.append(
Chopper(
frequency=comp["frequency"]["value"]
* sc.Unit(comp["frequency"]["unit"]),
direction=_dir,
open=_array_or_none(comp, "open"),
close=_array_or_none(comp, "close"),
centers=_array_or_none(comp, "centers"),
widths=_array_or_none(comp, "widths"),
phase=comp["phase"]["value"] * sc.Unit(comp["phase"]["unit"]),
distance=comp["distance"]["value"]
* sc.Unit(comp["distance"]["unit"]),
name=name,
"Only one source is allowed, but multiple were found in the"
"instrument parameters."
)
)
elif comp["type"] == "detector":
detectors.append(
Detector(
distance=comp["distance"]["value"]
* sc.Unit(comp["distance"]["unit"]),
name=name,
)
)
elif comp["type"] == "source":
beamline["source"] = Source.from_json(params=comp)
continue
else:
if comp["type"] not in mapping:
raise ValueError(
f"Unknown component type: {comp['type']} for component {name}. "
"Supported types are 'chopper', 'detector', and 'source'."
)
return {"choppers": choppers, "detectors": detectors}
beamline["components"].append(
mapping[comp["type"]].from_json(name=name, params=comp)
)
return beamline
Copilot AI commented Mar 10, 2026:

The docstring for make_beamline still mentions "choppers and detectors" in its description: "Create choppers and detectors from a dictionary" and "must be either 'chopper' or 'detector'", but the function now supports additional component types and returns {"components": [...], "source": ...} instead of {"choppers": [...], "detectors": [...]}. The docstring and return type hint (dict[str, list[Chopper] | list[Detector]]) are both outdated.

@nvaytet (Member Author):

Changed the return type to dict[str, Source | list[Component]]

Comment on lines 177 to +201
@@ -208,21 +186,19 @@ def add(self, component: Chopper | Detector):
component:
A chopper or detector.
"""
if not isinstance(component, (Chopper | Detector)):
if not isinstance(component, Component):
raise TypeError(
f"Cannot add component of type {type(component)} to the model. "
"Only Chopper and Detector instances are allowed."
"Component must be an instance of Component or derived class, "
f"but got {type(component)}."
)
# Note that the name "source" is reserved for the source.
if component.name in chain(self.choppers, self.detectors, ("source",)):
if component.name in (*self._components, "source"):
raise KeyError(
f"Component with name {component.name} already exists. "
"If you wish to replace/update an existing component, use "
"``model.choppers['name'] = new_chopper`` or "
"``model.detectors['name'] = new_detector``."
"``model.components['name'] = new_component``."
)
container = self.choppers if isinstance(component, Chopper) else self.detectors
container[component.name] = component
self._components[component.name] = component
Copilot AI commented Mar 10, 2026:

The add() method docstring says "Component names must be unique across choppers and detectors" and the component parameter description says "A chopper or detector." These descriptions are outdated now that the method accepts any Component instance (including InelasticSample). They should be updated to reflect the generalized behavior.

@SimonHeybrock (Member) left a comment:

Nice refactoring (says Claude) — the polymorphic component dispatch is a clear improvement over the old separate chopper/detector logic in Model.run(). I have some architectural observations alongside a few things Copilot flagged but suppressed or didn't fully explain.

(Copilot already covered: InelasticSample.plot() undefined attrs, _get_rays return type, capitalize() in repr, outdated docstrings in make_beamline and add(), missing space in error message, test typo. I won't repeat those.)

Two items on unchanged lines:

component.py:113: ComponentReading.tof property will crash at runtime
Copilot flagged this too but it bears repeating: Model.run() no longer computes or stores a tof coordinate. The .tof property tries to access data.coords["tof"] and will KeyError. Either reintroduce the coordinate (e.g. tof = toa - birth_time) in Model.run(), or remove/deprecate this property.

result.py:342: Result.data excludes InelasticSample readings
The data property only iterates self.choppers and self.detectors, so InelasticSample readings are silently excluded from the DataGroup. Seems unintentional — should this iterate self._components.values() instead?

return self.toa.plot(bins=bins) + self.wavelength.plot(bins=bins)


class Component:
Member:

Component uses @abstractmethod but doesn't inherit from ABC. Without ABC as a base class, the decorator is purely cosmetic — Python won't prevent Component() from being instantiated directly. Either add ABC as a base or drop @abstractmethod.

Suggested change
class Component:
class Component(ABC):

(with from abc import ABC, abstractmethod at the top)
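A minimal sketch of why the base class matters; the names mirror the PR but the bodies are placeholders:

```python
from abc import ABC, abstractmethod

class Component(ABC):
    @abstractmethod
    def apply(self, neutrons):
        """Apply this component's effect and return (neutrons, reading)."""

class Detector(Component):
    def apply(self, neutrons):
        return neutrons, "reading"

# With ABC as a base, instantiating the abstract class fails loudly:
try:
    Component()
    abstract_instantiation_blocked = False
except TypeError:
    abstract_instantiation_blocked = True

# Concrete subclasses work as usual:
_, reading = Detector().apply(neutrons=None)
```

Without `ABC` in the bases, the `try` block above would succeed silently, which is exactly the cosmetic-decorator problem described in the comment.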


@abstractmethod
def apply(
self, neutrons: sc.DataArray, time_limit: sc.Variable
Member:

Copilot flagged this but suppressed it as "low confidence" — I think it deserves attention.

The abstract signature declares time_limit: sc.Variable but none of the three concrete implementations accept it, and Model.run() calls comp.apply(neutrons=neutrons) without it. The abstract contract doesn't match reality. This will also trip up type checkers.

Should time_limit be removed from the abstract signature?

@nvaytet (Member Author):

Should time_limit be removed from the abstract signature?

Yes, I just removed it this morning from the chopper apply and forgot about the base class.



class Component:
kind: str
Member:

Minor inconsistency: on the Component side, kind is set as a mutable instance attribute in each __init__ (self.kind = "chopper" etc.), but on the *Reading dataclasses it's a @property. Since kind is inherent to the class (a Chopper is always a "chopper"), a class-level constant or abstract property would be more robust than relying on each subclass remembering to set it in __init__.
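A sketch of the class-level alternative being suggested (hypothetical, not the PR's actual code):

```python
class Component:
    # Declared (not assigned) here; each subclass supplies a class-level value,
    # so there is nothing for an __init__ to forget.
    kind: str

class Chopper(Component):
    kind = "chopper"

class InelasticSample(Component):
    kind = "inelastic_sample"

# The kind is available without instantiation, and instances see it too:
print(Chopper.kind, InelasticSample().kind)
```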

src/tof/utils.py Outdated
The kind of components to extract.
"""
return MappingProxyType(
{name: comp for name, comp in components.items() if kind in comp.kind}
Member:

This uses substring matching (kind in comp.kind), which is how Model.samples with kind="sample" matches InelasticSample whose kind is "inelastic_sample". That works today, but it's fragile — a future "sample_holder" kind would also match "sample".

Is the substring matching intentional? If so, it may be worth documenting that convention. If not, an exact match (or a kind hierarchy/set) would be more predictable.
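The fragility is easy to demonstrate in isolation ("sample_holder" is the hypothetical future kind mentioned above):

```python
def matches_substring(kind, comp_kind):
    # The matching currently used: kind in comp.kind
    return kind in comp_kind

def matches_exact(kind, comp_kind):
    # The stricter alternative
    return kind == comp_kind

# Substring matching lets kind="sample" find "inelastic_sample"...
hit_sample = matches_substring("sample", "inelastic_sample")
# ...but a hypothetical "sample_holder" would also match, likely unintentionally:
hit_holder = matches_substring("sample", "sample_holder")
# Exact matching would select neither unless the kind is literally "sample":
exact_hit = matches_exact("sample", "inelastic_sample")
```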

@nvaytet (Member Author):

The substring matching was intentional, but I agree that for now, we can just use an exact match.
I was thinking if there were different kinds of samples, we could return them all in one go. But it is unclear if this is even what we would want to do at that point.

Return the inelastic sample as a JSON-serializable dictionary.
.. versionadded:: 26.03.0
"""
return {
Member:

as_json() silently drops func (the callable), meaning a round-trip through as_json() / from_json() will lose the sample's core behavior. from_json() raises NotImplementedError, so at least deserialization fails loudly.

But Model.as_json() will happily serialize a model containing an InelasticSample, producing JSON that cannot be loaded back. Worth either:

  • Skipping InelasticSample in Model.as_json() with a warning (like the non-facility source case), or
  • Documenting this limitation

@nvaytet (Member Author):

Yes, this is an issue.
Any good idea as to how we can properly save the sample to json?
Was going with a callable the wrong thing to do?
Can we serialize the function body? That sounds brittle/potentially dangerous?

@nvaytet (Member Author):

For now, I went with

Skipping InelasticSample in Model.as_json() with a warning (like the non-facility source case)

@nvaytet (Member Author):

Any good idea as to how we can properly save the sample to json?

Claude did not have any ideas that would work for all. Here is a summary of what it came up with.

Option 1: Represent the function as a lookup key (recommended)
Rather than serializing arbitrary callables, define a registry of known/supported scattering functions inside tof, and serialize only the name (key) of the function:

# A registry of known inelastic scattering functions
INELASTIC_FUNC_REGISTRY = {
    "linear": lambda wav_i: wav_i * 1.1,
    "debye_waller": debye_waller_func,
    # ...
}

class InelasticSample:
    def as_json(self) -> dict:
        func_name = self._func_name  # stored at construction time
        if func_name not in INELASTIC_FUNC_REGISTRY:
            raise ValueError(f"Cannot serialize unknown function: {func_name}")
        return {
            "type": "inelastic_sample",
            "distance": var_to_dict(self.distance),
            "name": self.name,
            "func": func_name,
        }

This is safe, readable, and consistent with how the rest of the codebase works (e.g. "clockwise" / "anti-clockwise" for chopper direction). The downside is that users can only use pre-registered functions — but that's a feature, not a bug, since it makes JSON files portable and reproducible.

Option 2: Serialize via module:qualname reference
If you want to support user-defined functions but still avoid eval, you can store the function's fully qualified importable name using Python's __module__ and __qualname__.

import importlib

def func_to_json(func) -> dict:
    return {
        "module": func.__module__,
        "qualname": func.__qualname__,
    }

def func_from_json(d: dict):
    mod = importlib.import_module(d["module"])
    # Walk qualname for nested classes/functions
    obj = mod
    for part in d["qualname"].split("."):
        obj = getattr(obj, part)
    return obj

This serializes as e.g. {"module": "mypackage.scattering", "qualname": "debye_waller"}. On deserialization, you import the module and look up the attribute — no eval involved. The caveat: it only works for importable (module-level) functions, not lambdas or closures.

Option 3: dill / cloudpickle + base64
If you truly need to serialize arbitrary callables (including lambdas and closures):

import base64
import cloudpickle  # or dill

def func_to_json(func) -> str:
    return base64.b64encode(cloudpickle.dumps(func)).decode()

def func_from_json(s: str):
    return cloudpickle.loads(base64.b64decode(s))

This works for almost any Python callable. However:

  • It's not human-readable in the JSON
  • It's Python-version and library-version sensitive (pickles can break across versions)
  • It still carries security concerns when loading from untrusted sources (pickle deserialisation can execute arbitrary code, just like eval)

So this is really only suitable if the JSON files are treated as internal/trusted artifacts.

Member:

Option 1 sounds good though, if you combine it with the warning if a custom function is used?

@nvaytet (Member Author):

I did not go with option 1 because I thought that users will just have custom functions most of the time, but maybe I am wrong?

I propose that we can add this later as a follow-up?

self.distance = distance.to(dtype=float, copy=False)
self.name = name
self.func = func
self.kind = "inelastic_sample"
Member:

Unlike Chopper and Detector, InelasticSample has no __eq__ method. This means equality checks fall back to identity comparison (is), which may surprise users — e.g. two InelasticSample instances with the same distance/name/func won't compare equal. Intentional?

@nvaytet (Member Author):

TBH, I don't know if the chopper and detector __eq__ are used anywhere by anyone.
I cannot remember why I added them in the first place (maybe I wanted to detect if the same component was being added twice?).

I considered removing them altogether, but I decided against it because I thought since it's been in the code for a while, the chances of someone using that feature is non-zero.

Would you add a __eq__ for the sample comparing functions with is? Would that be a good enough workaround?

Member:

I'm fine with not adding it. Using is might be worse than not supporting it at all.

to, tc = self.open_close_times(time_limit=time_limit)
for i in range(len(to)):
m |= (neutrons.coords['toa'] > to[i]) & (neutrons.coords['toa'] < tc[i])
neutrons.masks['blocked_by_me'] = (~m) & (~neutrons.masks['blocked_by_others'])
Member:

Nit: apply() mutates the input neutrons by writing into its masks dict. This works because Model.run() makes a shallow copy beforehand, but the ownership contract is implicit.

It's fine as-is since it's an internal interface, but worth noting that apply() methods are expected to mutate the passed-in neutrons (at least for masks). If a caller ever forgets the shallow copy, this would silently corrupt shared state.

@nvaytet (Member Author):

I guess it's relatively cheap to just use assign_coords and assign_masks everywhere just to be safe?

self,
source: Source | None = None,
components: list[Component] | tuple[Component, ...] | None = None,
choppers: list[Chopper] | tuple[Chopper, ...] | None = None,
Member:

Question: is there a reason to keep the separate choppers and detectors parameters? This is already a fairly large API change (new Component base class, SourceParameters renamed to SourceReading, etc.), so I'm wondering whether it'd be cleaner to just break the old API in one go — or are there downstream users relying on the old keyword arguments?

@nvaytet (Member Author):

I think a lot of people have notebooks that use choppers and detectors, which is why I kept them. I don't think it's too harmful to keep them.

I don't think anyone relied on the specific class for SourceParameters/SourceReading, as it was just "the thing that described the source in the results".

meter = sc.Unit('m')


def test_inelastic_sample_flat_distribution():
Member:

The three distribution tests (flat, double_peaked, normal) are structurally identical — only the func differs. Consider @pytest.mark.parametrize to reduce duplication.

Also, there's no coverage for edge cases like: multiple InelasticSample components in a single model, or an InelasticSample placed between two choppers (vs. after all choppers). These seem like important scenarios to verify.
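The suggested parametrized form could look like the sketch below. The three functions are hypothetical stand-ins for the tests' actual delta-E functions, and plain numpy arrays stand in for scipp variables to keep the example self-contained:

```python
import numpy as np
import pytest

rng = np.random.default_rng(seed=1234)

# Hypothetical stand-ins for the flat / double-peaked / normal delta-E functions
def flat(e_i):
    return e_i - rng.uniform(-0.2, 0.2, size=e_i.shape)

def double_peaked(e_i):
    return e_i - rng.choice([-0.2, 0.2], size=e_i.shape)

def normal(e_i):
    return e_i - rng.normal(scale=0.05, size=e_i.shape)

@pytest.mark.parametrize("func", [flat, double_peaked, normal])
def test_inelastic_sample_distribution(func):
    # Shared body: only the energy-transfer function varies between cases
    e_i = np.full(1000, 5.0)
    e_f = func(e_i)
    assert e_f.shape == e_i.shape
    assert np.all(e_f > 0)  # shifts are small, so all final energies stay physical
```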

Comment on lines +52 to +54
da = self.data.copy(deep=False)
da.data = da.coords[self.dim]
return da.mean().data
Member:

Note

Suggested change
da = self.data.copy(deep=False)
da.data = da.coords[self.dim]
return da.mean().data
return da.assign(da.coords[self.dim]).mean().data

etc. should work with recent Scipp.

@nvaytet (Member Author) commented Mar 10, 2026

I think I addressed everything now? Did I miss something?

@nvaytet nvaytet merged commit 99684b8 into main Mar 11, 2026
4 checks passed
@nvaytet nvaytet deleted the inelastic-sample branch March 11, 2026 09:10
@github-project-automation github-project-automation bot moved this from Selected to Done in Development Board Mar 11, 2026

Labels

None yet

Projects

Status: Done

Development

Successfully merging this pull request may close these issues.

Add "sample" component that changes neutron energy

4 participants