
make a serializable version of FunctionalProfile and FunctionalMagnetism #219

Open — wants to merge 5 commits into master
Conversation

bmaranville
Member

These versions of FunctionalProfile and FunctionalMagnetism differ from the originals:

  • If the classes are initialized with a profile function, they will read its source with inspect.getsource.
  • If the classes are initialized with a profile_source string, the string is executed in a context with scipy and np defined, and the resulting function is extracted. (The function is expected to be the only name defined in the locals context when the exec completes, so the string should contain a single function definition and nothing else.)
  • Profile functions can only use basic math operators and anything in the np and scipy namespaces (which are embedded in the execution context), e.g.

    def blobby(z, a, b):
        return scipy.special.erf(b) * np.sin(a * z)

  • Parameters for the function are no longer added to the class as attributes; they are instead stored in a dict attribute, FunctionalProfile.profile_params.
  • New attributes are added which are Parameter objects, each with a Calculation object in its slot attribute:
    • rho_start
    • irho_start
    • rho_end
    • irho_end
  • The start and end attributes are now properties, which create an SLD object on access, e.g. start == SLD(rho=rho_start, irho=irho_start).

For FunctionalMagnetism, a further difference is that the total thickness is stored as an expression on the class. You can set it with a set_anchor function as before (that function will now construct a thickness expression), or you can pass in an existing Parameter or Expression, e.g. layer[3].thickness + layer[4].thickness
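The idea of a thickness expression can be sketched with a toy Parameter class (this is not the bumps API; it only illustrates how `layer[3].thickness + layer[4].thickness` can yield a lazily evaluated expression that tracks later changes to the layers):

```python
class Parameter:
    """Toy stand-in for a fit parameter holding a single value."""
    def __init__(self, value):
        self.value = value
    def __add__(self, other):
        # Adding two parameters builds an expression, not a number.
        return SumExpression(self, other)

class SumExpression:
    """Lazily evaluated sum of two parameters (or expressions)."""
    def __init__(self, a, b):
        self.a, self.b = a, b
    @property
    def value(self):
        return self.a.value + self.b.value

t3, t4 = Parameter(10.0), Parameter(25.0)
total = t3 + t4   # anchor expression, analogous to layer[3].thickness + layer[4].thickness
t4.value = 30.0   # updating a layer thickness is reflected in the expression
```

Because `total` holds references to the parameters rather than a snapshot of their values, re-evaluating `total.value` after the update returns the new sum.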

@bmaranville
Member Author

Given that these changes will probably break the way that e.g. molgroups use FunctionalProfile, we should probably offer them as a new type rather than replacing the existing classes.

@acaruana2009
Contributor

> Given that these changes will probably break the way that e.g. molgroups use FunctionalProfile, we should probably offer them as a new type rather than replacing the existing classes.

I am happy with this. How would it work in principle? Would we just have a different name for the class/type?

@pkienzle
Member

Not loving the increased verbosity of layer.profile_params["period"] but I don't have a counterproposal. The class already has a parameters attribute, so simplifying it to pars or params would be confusing.

Does the existing layer.period use too much behind-the-scenes magic? Do the dynamic attributes interfere with dataclass serialization?

Maybe have a profile_parameters attribute to support serialization/deserialization, and document it as an internal attribute that should be ignored in the manual.

start/end as properties could be problematic since they create a new SLD object each time. That is, layer.start == layer.start will be false. I'm not sure this matters, though, since the underlying parameters are shared.

Is there a case for end users providing {rho,irho}_{start,end} parameters when creating a profile? Or are these also noise required for dataclass deserialization?

@bmaranville
Member Author

Actually, given the way dataclass equality comparison works, the result is

(layer.start == layer.start) is True

(it compares attributes, and they're all equal)
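The point above can be demonstrated with a minimal sketch (the Layer class and its fields here are hypothetical, standing in for the PR's start property): a dataclass compares field values, so two SLD objects created on separate property accesses are equal even though they are distinct objects.

```python
from dataclasses import dataclass

@dataclass
class SLD:
    rho: float = 0.0
    irho: float = 0.0

class Layer:
    def __init__(self, rho_start=0.0, irho_start=0.0):
        self.rho_start = rho_start
        self.irho_start = irho_start
    @property
    def start(self):
        # A fresh SLD instance is built on every access.
        return SLD(rho=self.rho_start, irho=self.irho_start)

layer = Layer(rho_start=2.07)
print(layer.start is layer.start)   # False: distinct objects each time
print(layer.start == layer.start)   # True: dataclass compares field values
```

So identity (`is`) differs on each access, but equality (`==`) holds, which is what matters for comparing layers.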

The dynamic attributes don't specifically interfere with serialization, but add maintenance overhead for the developers (weird bugs!) and cognitive overhead for the users (not a common pattern in python anymore, and full of footguns).

The rho_start etc. are by default None, and I agree most users won't initialize them directly.
