
Binary evolution is not stable against timestep changes #1277

Closed
veome22 opened this issue Nov 7, 2024 · 13 comments
Labels: bug (Something isn't working) · severity_moderate (This is a moderately severe bug) · urgency_moderate (This is a moderately urgent issue)

@veome22
Collaborator

veome22 commented Nov 7, 2024

Providing custom time steps seems to result in very different binary evolution outcomes. I'm trying to use custom time steps to ensure an apples-to-apples comparison between different tidal mechanisms. But taking the time steps from KAPIL2024 and using them with any other tidal prescription (NONE, PERFECT) seems to result in a very early stellar merger. Weirdly, the same binary usually forms a DCO with all tidal prescriptions when not using custom time steps.

To Reproduce
Running the following binary simulation with --tides-prescription KAPIL2024:

COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0  --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE  --tides-prescription KAPIL2024 

produces 0: DCO formed: (Main_Sequence_>_0.7 -> Black_Hole) + (Main_Sequence_>_0.7 -> Black_Hole).

Running the same binary, but with --tides-prescription NONE and using --timesteps-filename timesteps.txt, generated from the KAPIL2024 simulation:

COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0  --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription NONE  --timesteps-filename timesteps.txt

results in 0: User-provided timesteps not consumed: (Main_Sequence_>_0.7 -> Naked_Helium_Star_MS) + (Main_Sequence_>_0.7 -> Main_Sequence_>_0.7).

As far as I can tell, there is a huge difference when the primary goes from HG to HeMS sometime after $t=5.035870$ Myr. In both KAPIL2024 tides as well as NONE tides with default time steps, the HeMS step results in a much wider binary. But when using custom timesteps, the NONE tides system decides to significantly shrink the orbit instead.

The conditions at the end of HG are very similar across the board, so it is unclear why such drastically different outcomes happen.
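For reference, a minimal pandas sketch for pulling this window out of the two detailed-output files (the filenames and the column names `Time`, `Stellar_Type(1)`, and `SemiMajorAxis` are placeholders, and COMPAS CSV logfiles may carry extra type/unit header rows, so check your own logs):

```python
# Sketch: inspect both runs near the HG -> HeMS transition (~5.036 Myr).
# Filenames and column names are assumptions; header=2 assumes two extra
# type/unit header rows above the column-name row -- adjust as needed.
import pandas as pd

for path, label in [("Detailed_Output_KAPIL2024.csv", "KAPIL2024"),
                    ("Detailed_Output_NONE.csv", "NONE + custom timesteps")]:
    df = pd.read_csv(path, header=2)
    win = df[df["Time"].between(5.03, 5.05)]
    print(label)
    print(win[["Time", "Stellar_Type(1)", "SemiMajorAxis"]].to_string(index=False))
```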

Screenshot: KAPIL2024 vs NONE tides + custom time steps.
[image omitted]

Versioning:
COMPAS v03.07.05

@veome22 veome22 added bug Something isn't working severity_moderate This is a moderately severe bug urgency_moderate This is a moderately urgent issue labels Nov 7, 2024
@jeffriley
Collaborator

jeffriley commented Nov 7, 2024

I think the problem description might be better phrased as "Binary evolution is not stable against some timestep choices" - i.e. I don't think (though I confess I haven't tested this) that the issue is that the mechanism of custom timesteps is causing a problem, just that the timestep durations are causing evolution to occur in a manner we don't expect. To test that, run a binary with detailed output, grab the timesteps and run it again using those timesteps as custom timesteps. If that produces a different outcome then yes, custom timesteps are a problem and we need to fix the mechanism. If not, then the problem is the choice of timesteps, and that might just be (for now) how COMPAS works (possibly related to #1259 and #24 - in that we know there are convergence issues that are timestep dependent).

Wait - you said you did the test I suggested above? Hmmm. Let me think about that.

@veome22 veome22 changed the title Binary evolution is not stable against custom timesteps Binary evolution is not stable against timestep changes Nov 8, 2024
@jeffriley
Collaborator

jeffriley commented Nov 8, 2024

The only difference in the test I described should be this paragraph in the documentation:

"Note that COMPAS will use the timestep exactly as read - the timesteps read from the timesteps file are not quantised by COMPAS, and neither will they be truncated to any limit (e.g. ABSOLUTE_MINIMUM_TIMESTEP from constants.h) or multiplied by any timestep multiplier set by the timestep-multiplier option. Furthermore, no check will be made after the timestep is taken to limit the radial expansion of the star being evolved."

Otherwise, evolution should proceed the same way for both binaries.
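For concreteness, a minimal Python sketch of the two paths that paragraph implies (the constant value and the function shape are illustrative only, not the actual COMPAS C++; ABSOLUTE_MINIMUM_TIMESTEP and the timestep multiplier are the real concepts named in the documentation):

```python
# Rough sketch (not the actual COMPAS code) of the two timestep paths
# described in the documentation paragraph above.
ABSOLUTE_MINIMUM_TIMESTEP = 1.0e-12  # placeholder value; see constants.h

def next_timestep(dt_proposed, timestep_multiplier, custom_dt=None):
    if custom_dt is not None:
        # Timesteps read from a timesteps file are used exactly as given:
        # no quantisation, no minimum-timestep floor, no multiplier, and
        # no post-step limit on the star's radial expansion.
        return custom_dt
    # COMPAS-chosen timesteps are scaled and floored before being taken,
    # and the radial-expansion check applies after the step.
    dt = dt_proposed * timestep_multiplier
    return max(dt, ABSOLUTE_MINIMUM_TIMESTEP)
```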

@veome22
Collaborator Author

veome22 commented Nov 8, 2024

@jeffriley I'm not at my computer right now, but I did actually try running a binary with the same time steps, and it worked just fine.

It's only a problem if I then change something else minor, like, in this case, the choice of tidal prescription (which should not hugely affect the binary orbital properties).

@jeffriley
Collaborator

Ah - you did change the tides prescription. Should we expect exactly the same evolution under different tides prescriptions? We know KAPIL2024 is sensitive to timestep duration, don't we?

@jeffriley
Collaborator

It's only a problem if I then change something else minor, like in this case, the choice of tidal prescription
(which should not hugely affect the binary orbital properties)

Hmmm. Ok, I might run some tests later (busy today)

@ilyamandel
Collaborator

I ran both with the latest code version (03.07.04), but with one change: I added --timesteps-filename timesteps.txt to the first call as well, to ensure I was really using the same timesteps in both cases. I seem to get broadly the same output (I haven't made plots, but at least the runs end at the same evolutionary stages):

$ ./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription KAPIL2024 --timesteps-filename timesteps.txt

COMPAS v03.07.04
Compact Object Mergers: Population Astrophysics and Statistics
by Team COMPAS (http://compas.science/index.html)
A binary star simulator

Go to https://compas.readthedocs.io/en/latest/index.html for the online documentation
Check https://compas.readthedocs.io/en/latest/pages/whats-new.html to see what's new in the latest release

Start generating binaries at Fri Nov 8 13:36:55 2024

0: User-provided timesteps not consumed: (Main_Sequence_>_0.7 -> Naked_Helium_Star_MS) + (Main_Sequence_>_0.7 -> Main_Sequence_>_0.7)

Generated 1 of 1 binaries requested

Simulation completed

End generating binaries at Fri Nov 8 13:36:55 2024

Clock time = 0.031479 CPU seconds
Wall time = 0000:00:00 (hhhh:mm:ss)

$ ./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription NONE --timesteps-filename timesteps.txt

COMPAS v03.07.04
Compact Object Mergers: Population Astrophysics and Statistics
by Team COMPAS (http://compas.science/index.html)
A binary star simulator

Go to https://compas.readthedocs.io/en/latest/index.html for the online documentation
Check https://compas.readthedocs.io/en/latest/pages/whats-new.html to see what's new in the latest release

Start generating binaries at Fri Nov 8 13:37:31 2024

0: User-provided timesteps not consumed: (Main_Sequence_>_0.7 -> Naked_Helium_Star_MS) + (Main_Sequence_>_0.7 -> Main_Sequence_>_0.7)

Generated 1 of 1 binaries requested

Simulation completed

End generating binaries at Fri Nov 8 13:37:31 2024

Clock time = 0.028225 CPU seconds
Wall time = 0000:00:00 (hhhh:mm:ss)

@jeffriley
Collaborator

jeffriley commented Nov 9, 2024

I did this with COMPAS v03.07.05:

./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription KAPIL2024 --logfile-type csv --detailed-output

and harvested the timesteps from the detailed output file, creating timesteps file 'timesteps.txt'
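A minimal sketch of that harvesting step (the filename, the `dT` column name, and the header layout below are assumptions; check them against your own detailed-output file):

```python
# Sketch: pull the dT column out of a COMPAS detailed-output CSV and
# write one timestep per line to a timesteps file.
import csv

with open("BSE_Detailed_Output_0.csv") as fin:  # hypothetical filename
    rows = list(csv.reader(fin))

names = [h.strip() for h in rows[2]]  # COMPAS CSV logs typically carry extra
dt_col = names.index("dT")            # type/unit header rows above the names

with open("timesteps.txt", "w") as fout:
    for row in rows[3:]:
        fout.write(row[dt_col].strip() + "\n")
```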

I then did this:

./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription KAPIL2024 --logfile-type csv --detailed-output --timesteps-filename timesteps.txt

and the resultant detailed output file was exactly the same as the first run. Both runs resulted in:

0: DCO formed: (Main_Sequence_>_0.7 -> Black_Hole) + (Main_Sequence_>_0.7 -> Black_Hole)

I then did this

./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription NONE --logfile-type csv --detailed-output --timesteps-filename timesteps.txt

and got a very different outcome. Evolution was much shorter, and the result was:

0: User-provided timesteps not consumed: (Main_Sequence_>_0.7 -> Naked_Helium_Star_MS) + (Main_Sequence_>_0.7 -> Main_Sequence_>_0.7)

I then did this:

./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription NONE --logfile-type csv --detailed-output

and got a different result:

0: DCO formed: (Main_Sequence_>_0.7 -> Black_Hole) + (Main_Sequence_>_0.7 -> Black_Hole)

So, if we let COMPAS determine what timesteps to take, we get (I think) the result we were expecting, but if we force-feed it the smaller timesteps required for KAPIL2024 tides, things go awry (well, differently - maybe awry). But are we terribly surprised by that?

EDIT: Just checked - the timesteps file I created is exactly the same as the timesteps file uploaded by @veome22 (which was presumably the timesteps file used by @ilyamandel)

@ilyamandel
Collaborator

Hi @jeffriley,

The match between

  1. ./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription KAPIL2024 --logfile-type csv --detailed-output

and

  2. ./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription KAPIL2024 --logfile-type csv --detailed-output --timesteps-filename timesteps.txt

shows that all is well and your functionality for saving and forcing timesteps works great.

The mismatch between 2) and

  3. ./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription NONE --logfile-type csv --detailed-output --timesteps-filename timesteps.txt

is not a problem per se; we have different tides prescriptions, so the binary evolution may well be different.

At first I thought that the mismatch between 3) and

  4. ./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription NONE --logfile-type csv --detailed-output

was an issue, because this indicates that one of those runs has not converged. And I even ran

  5. ./COMPAS --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0 --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE --tides-prescription NONE --logfile-type csv --detailed-output --mass-change-fraction 0.001 --radial-change-fraction 0.001

to check, enforcing smaller timesteps. But this returned the same result as 4), so it doesn't seem to be a convergence issue.

But then it occurred to me that, because the evolution of the system with tides proceeds a bit differently, we may be taking shorter vs. longer timesteps in the wrong places when comparing 3) and 4). For example, at the same timestep we may have an HG star in 4), which requires smaller timesteps because of its rapid evolution, but an already stripped HeMS star in 3) [because those timesteps are taken from a run with tides, and tides, say, forced the binary closer, leading to earlier stripping], which allowed for longer timesteps. So 4) ends up taking timesteps that are too large for the binary it is evolving, and gets flawed results.
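This effect is easy to reproduce with a toy integrator, entirely outside COMPAS: harvest the step schedule from one system and apply it to a slightly different one whose rapid phase starts earlier, so the small steps no longer line up with it (a sketch with made-up numbers, not COMPAS code):

```python
# Toy illustration: a timestep schedule harvested from "system A"
# under-resolves "system B" when B's rapid phase starts earlier.
import math

T_END, K_SLOW, K_FAST, WIDTH = 10.0, 0.1, 50.0, 0.2  # arbitrary toy values

def k(t, t_fast):
    # decay rate spikes during a brief "rapid evolution" phase
    return K_FAST if t_fast <= t < t_fast + WIDTH else K_SLOW

def schedule(t_fast):
    # adaptive-style schedule: small steps only where THIS system is rapid
    steps, t = [], 0.0
    while t < T_END:
        dt = 0.002 if t_fast <= t < t_fast + WIDTH else 0.1
        steps.append(dt)
        t = round(t + dt, 9)  # keep t on a clean grid at phase boundaries
    return steps

def evolve(t_fast, steps):
    # explicit Euler for dy/dt = -k(t) * y; unstable whenever k * dt > 2
    y, t = 1.0, 0.0
    for dt in steps:
        y += dt * (-k(t, t_fast) * y)
        t = round(t + dt, 9)
    return y

exact = math.exp(-(K_SLOW * (T_END - WIDTH) + K_FAST * WIDTH))  # same for A and B
steps_A = schedule(t_fast=5.0)                 # schedule "harvested" from system A
print(evolve(5.0, steps_A), exact)  # A with its own schedule: same ballpark
print(evolve(4.5, steps_A), exact)  # B with A's schedule: k*dt = 5 during B's
                                    # fast phase, so the answer is nonsense
```

The first run agrees with the analytic answer to within a factor of a few; the second is off by orders of magnitude because the coarse steps land on B's rapid phase.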

So the summary is, no obvious problem based on this example. Now, whether tides are behaving correctly is a separate problem, I haven't checked that.

@jeffriley
Collaborator

@veome22 I don't think there is a problem here - are you ok to close this, or do you want to do some more tests first?

@veome22
Collaborator Author

veome22 commented Nov 11, 2024

Hi @jeffriley and @ilyamandel, thank you for the tests!

I am happy to not call this a bug, but the reason I flagged this as an issue was that, as far as I understand, stellar evolution should not be affected by tides. I think Ilya's diagnosis is correct, and the stellar type at each timestep when using different tidal prescriptions is not consistent between simulations, so taking smaller/larger timesteps can make a substantial difference. But why do the differences arise in the first place?

In the simulation with tides, the primary evolves on the MS until Time=$5.03555$ Myr, M1=$35.801476\, M_\odot$, R1=$305.509356\, R_\odot$, and semi-major axis=$4.206575$ AU. The proposed timestep here is dT=$3.228402 \times 10^{-4}$ Myr. Somewhere between this timestep and the next, the primary evolves into an HeMS star.

In the simulation WITHOUT tides, everything about the primary is the same, and the same size timestep is taken. The only difference is that the semi-major axis is $4.318706$ AU (slightly larger because there are no tides). In this case, the primary stays on the MS for another couple of timesteps, but I don't really understand why. Is there a binding energy modification to the MS timescale?

I'm happy to close this issue if there's a reasonable (non-bug) explanation for the difference in stellar evolution.

@jeffriley
Collaborator

In this case, the primary stays on the MS for another couple of time steps.

I don't see that - maybe we're looking at different binaries?

I run

./compas --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0  --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE  --tides-prescription KAPIL2024 --logfile-type csv --detailed-output

harvest the timesteps, then run

./compas --random-seed 0 -n 1 --initial-mass-1 40 --initial-mass-2 25 --semi-major-axis 4.0  --eccentricity 0.0 --chemically-homogeneous-evolution-mode NONE  --tides-prescription NONE --timesteps-filename kapilsteps.txt --logfile-type csv --detailed-output

The first run results in

0: DCO formed: (Main_Sequence_>_0.7 -> Black_Hole) + (Main_Sequence_>_0.7 -> Black_Hole)

whereas the second run results in

0: User-provided timesteps not consumed: (Main_Sequence_>_0.7 -> Naked_Helium_Star_MS) + (Main_Sequence_>_0.7 -> Main_Sequence_>_0.7)

but in both cases the primary evolves off the MS to an HG star at the same timestep. The difference is that in the second run the primary immediately evolves to an HeMS star, then the stars merge. In the first run the primary stays as an HG star for many timesteps, then evolves to HeMS where it stays for a while, then HeHG for a bit, then BH.

@veome22
Collaborator Author

veome22 commented Nov 11, 2024

Okay, maybe I will need to test again using the latest COMPAS version. In any case, it's likely that this is not a serious issue. I'll try first thing tomorrow and let you know

@veome22
Collaborator Author

veome22 commented Nov 12, 2024

Quick update: the stellar evolution is indeed fine when using custom timesteps. I don't know how, but maybe the latest COMPAS version fixed things. I now believe that the drastically different outcomes are indeed caused by something in the tidal implementation, which I will investigate. For now, I will close this issue. Thanks @jeffriley and @ilyamandel!

@veome22 veome22 closed this as completed Nov 12, 2024