Hi @GuilhemH
Yes, you are right about the typo in the current dc equation (line 166). I have just pushed a fix to the GitHub repo. I've actually changed the linear function definition as I prefer 'ax+b' to 'a + bx' (slope always comes first in my mind 🙂).
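Just to spell out the convention change (the variable names here are only illustrative, not necessarily what the repo uses):

# Old form, intercept first:  f(x) = a + b*x
# New form, slope first:      f(x) = a*x + b
def linear(x, a, b):
    return a * x + b  # slope a, then intercept b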
Regarding the builtin compileDownhole function: this function takes the raw ratio data, along with the beam seconds data, for each selection in the ref mat and then interpolates them onto a common timeline. You can see the need for this because there's no guarantee that the beam seconds values will align, especially on mass specs with variable sweep times. I agree that the start and end trimming should use the time steps of the interpolated data. The most likely cause of the difference between the compiled data's time steps and your original data is that the builtin function aims to have 100 data points in the interpolated data. Presumably one of your selections has fewer points than this, and so the interpolated values have a different sampling rate.
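To illustrate why the sampling rate changes, here is a minimal sketch (not the builtin itself, just an assumption based on the ~100-point behaviour described above, with made-up numbers):

import numpy as np
# Hypothetical beam seconds for one selection: a 35 s ablation sampled every 0.5 s
raw_time = np.arange(0.0, 35.0, 0.5)
# Assumed behaviour of the builtin: resample each selection to ~100 points
resampled_time = np.linspace(raw_time[0], raw_time[-1], num=100)
print("raw time step:", raw_time[1] - raw_time[0])                    # 0.5 s
print("resampled time step:", resampled_time[1] - resampled_time[0])  # ~0.35 s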
If you wanted to do the same thing purely in your Python script (rather than calling the C++ function), I have copied below a script that does the same as the builtin function; keeping it in Python means that you can see exactly how it works. It even plots up the data for you, but I would remove that once you're happy that it's working the way you want.
If you spot any errors with it, please let me know.
import numpy as np
# Get data series and selections
raw_ratio = data.timeSeries("Pb206/U238")
bs = data.timeSeries("BeamSeconds")
sels = data.selectionGroup("Z_91500").selections() # Change this to your reference material sel group name
# Extract time and ratio data using list comprehensions
times = [bs.dataForSelection(sel) for sel in sels]
ratio_vals = [raw_ratio.dataForSelection(sel) for sel in sels]
# Get min and max times
min_time, max_time = min(t[0] for t in times), max(t[-1] for t in times)
# Calculate average time tick using numpy operations
avg_tick = np.mean([(np.max(t) - np.min(t)) / (len(t) - 1) for t in times])
# Need some common time array to show all data
num_points = int(np.round((max_time - min_time) / avg_tick)) + 1
common_time = np.linspace(min_time, max_time, num=num_points)
# Now interpolate the data to the common time array
interpolated_ratios = [np.interp(common_time, t, r) for t, r in zip(times, ratio_vals)]
# Now calculate the average ratio at each time point
avg_ratios = np.mean(interpolated_ratios, axis=0)
# And now plot the results
import matplotlib.pyplot as plt
# Change the backend to qt5agg for interactive plotting
plt.switch_backend('qt5agg')
plt.figure(figsize=(10, 6))
# Plot each selection's ratio
for t, r in zip(times, ratio_vals):
    plt.plot(t, r, color='lightgrey')
plt.plot(common_time, avg_ratios, label='Average Pb206/U238 Ratio', color='blue')
plt.xlabel('Time (seconds)')
plt.ylabel('Pb206/U238 Ratio')
plt.title('Average Pb206/U238 Ratio Over Time')
plt.legend()
plt.grid()
plt.show()
You can use the above in iolite's Python Workspace. Just make sure to update the name of your primary RM if it isn't Z_91500.
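On the trimming point above: once you have common_time and avg_ratios, you could trim the start and end using the time step of the interpolated data rather than the raw one. A rough sketch (the 2 s trim is just a placeholder value):

# Trim the compiled data by a fixed number of seconds at each end,
# using the time step of the interpolated (common) time array
tick = common_time[1] - common_time[0]
trim_seconds = 2.0                              # placeholder; use whatever trim you need
n_trim = int(np.round(trim_seconds / tick))
trimmed_time = common_time[n_trim:len(common_time) - n_trim]
trimmed_ratios = avg_ratios[n_trim:len(avg_ratios) - n_trim]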
Please let me know how it goes.
Kind regards,
Bence