You can convert a single well top set to a points object.
Select (check the box) the well top set you want to convert to points.
Right-click on an attribute for the tops layer (e.g., Z).
Select “Convert to points (using filter)”.
You will get a new points object.
Between ResearchGate and Google Scholar, I recently noticed a few more citations of my work. Check out those links if you’re interested. Getting more exposure slowly but surely…
ETA: Looks like Drupal really doesn’t like emoji. 🙁
This wasn’t documented anywhere (maybe it’s obvious?), but here is a short example using the included ‘bot’ dataset.
# Look at the existing factors, stored in the bot@fac slot
bot@fac
# Add another arbitrary factor (placeholder values; assumes bot@fac is a data frame)
newfac <- factor(rep(c("A", "B"), length.out = nrow(bot@fac)))
# Put your new factor back into the object
bot@fac$type <- newfac
# Look at your data based on the new factor
table(bot@fac$type)
Simulation logs (one per processor used during the simulation) are located in the out folder of your model, named log.pma, log1.pma, log2.pma, etc.
Possible use for this information: write a script that automatically copies log files to another location for later review if needed (a watch folder in Python?). Bonus points if you timestamp your notes so you can identify the right file to look at.
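That copy-script idea can be sketched with the Python standard library. Everything here is an assumption for illustration: the function name, the folder arguments, and the `log*.pma` pattern (taken from the filenames above) are mine, not PetroMod's.

```python
import shutil
import time
from pathlib import Path

def archive_logs(out_dir, archive_dir, pattern="log*.pma"):
    """Copy simulation log files into a timestamped subfolder for later review."""
    out_dir = Path(out_dir)
    # One timestamped folder per run, so you can match logs to your notes
    dest = Path(archive_dir) / time.strftime("%Y%m%d-%H%M%S")
    copied = []
    for log in sorted(out_dir.glob(pattern)):
        dest.mkdir(parents=True, exist_ok=True)
        copied.append(Path(shutil.copy2(log, dest / log.name)))
    return copied
```

Run it after each simulation (or from a scheduled task) and the timestamped folder name does the note-matching for you.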
I left a brief race report over at Northern Plains Athletics. Had a blast.
Update: I finally looked, and this was my second-largest day of climbing ever, after the Ragnarok 105 in 2012. On that day I climbed 7,143 feet in 109 miles; at MDH I climbed 5,191 feet in 50 miles.
Midsummer update: Lots of things going on here in science world. Mostly I’ve been working with PetroMod to create some basin history models of the Williston Basin, which is fun because I get to dig way into how PetroMod works and how different variables affect hydrocarbon generation and migration.
To import formation tops to Petra, your input file needs one row per well, one column per formation top, and the depth measurement as each value. In the example below, I am trying to rough out the depths of formations in Wyoming using only the TD (total depth) and BOTFORM (formation at total depth). Not the most precise method of building structural maps, but it will work on a large scale.
The input file has one row per well, with columns api_full, TD, and BOTFORM (at least).
Caution: I tried to run this on the full WY well dataset (111,000 rows) and R used all 36 GB of my RAM and then crashed. It’s advisable to subset first.
library(reshape)
# Read data in (hypothetical filename)
ws <- read.csv("wy_wells.csv")
# Remember column names
names(ws)
# Example: reshape a sample, because the whole dataset is over 100,000 rows
wss <- ws[sample(nrow(ws), 1000), ]
# Melt according to API and formation at TD
wsr <- melt(wss, id.vars = c("api_full", "BOTFORM"), measure.vars = "TD")
# Cast into a new data frame
wsc <- cast(wsr, api_full ~ BOTFORM)
Output will be one row per well, one column for each formation name in BOTFORM, and TD as the value.
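Given the memory trouble above, the same reshape can be sketched in Python with pandas, which may cope better with the full dataset. The wells, depths, and formation names below are made up for illustration; only the column names match the Petra input.

```python
import pandas as pd

# Hypothetical sample mirroring the input: one row per well,
# with columns api_full, TD, and BOTFORM.
ws = pd.DataFrame({
    "api_full": ["4900501", "4900502", "4900503"],
    "TD": [7500, 9200, 8100],
    "BOTFORM": ["Minnelusa", "Madison", "Minnelusa"],
})

# Pivot to one row per well, one column per formation, TD as the value --
# the same wide layout the cast() call produces.
wide = ws.pivot_table(index="api_full", columns="BOTFORM", values="TD")
print(wide)
```

Wells with no pick for a given formation come out as NaN, just as cast() leaves them empty.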
It does. And it can be hard. But that doesn’t mean that it needs to be. What good are all these machines if we can’t use them to speed things up?
I’m not bored, so I won’t be today. Someone could, however, look at the early success of the Great Rides Fargo bikeshare program and compare:
the footprint of the stations, according to population density versus Grand Forks (or the nearby city of your choice)
the footprint of the stations, according to how much area it would cover in Grand Forks (or the nearby city of your choice)
That might be enough to start thinking seriously about bikeshare in other areas. Why not Grand Forks (another great hashtag)? Why not Winnipeg?