Eclipse timing for Peoria, Oregon, but times and alt/az will be close for Corvallis, Albany, Shedd, etc. I was just outside planning how I was going to lay out a white surface to try to photograph any shadow bands that might appear. Thought this might be of interest, so I did a quick cut-n-paste from a text file.
Touches down on the Oregon coast between Lincoln City and Newport at 10:15 a.m. PDT.
Following data from:
https://eclipse.gsfc.nasa.gov/SEgoogle/SEgoogle2001/SE2017Aug21Tgoogle.html
I added the PDT (-7) times.
Lat.: 44.4[redacted]° N
Long.: 123.2[redacted]9° W
Total Solar Eclipse
Duration of Totality: 1m11.4s
Magnitude: 1.003
Obscuration: 100.00%
Event Date Time (UT) PDT Alt Azi
Start of partial eclipse (C1) : 2017/08/21 16:04:50.6  09:04:50  27.6° 100.8°
Start of total eclipse (C2)   : 2017/08/21 17:17:10.3  10:17:10  39.9° 116.3°
Maximum eclipse               : 2017/08/21 17:17:45.9  10:17:45  40.0° 116.5°
End of total eclipse (C3)     : 2017/08/21 17:18:21.7  10:18:21  40.0° 116.6°
End of partial eclipse (C4)   : 2017/08/21 18:37:29.7  11:37:29  51.2° 139.5°
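If you'd rather re-derive the PDT column than trust my arithmetic, GNU date can do the conversion (assuming GNU coreutils and an installed tz database; BSD date takes different options):

```shell
# C1 contact time from the NASA page, converted from UT to Pacific time.
# Asking for %Z confirms the zone was in daylight time on eclipse day.
TZ=America/Los_Angeles date -d '2017-08-21 16:04:50 UTC' '+%H:%M:%S %Z'
# → 09:04:50 PDT
```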
Some of my non-professional interests. So, little or nothing that is security-related.
Sunday, August 20, 2017
Saturday, May 13, 2017
Crab Nebula with 5 telescopes
The composite is linked above; the starting point for all of this is here on hubblesite.org. There are links to the individual images from the five telescopes involved, covering five spectral regions. Highly recommended.
- Very Large Array (radio)
- Spitzer Space Telescope (infrared)
- Hubble Space Telescope (visual)
- XMM-Newton (ultraviolet)
- Chandra X-ray Observatory (X-ray)
"We have also used the Atacama Large Millimeter/submillimeter Array (ALMA) to produce the first detailed radio continuum image of the center of the nebula at 100 GHz, although the ALMA observations were not contemporaneous with the others ..." The features near the center of the nebula were moving at ~20% of light-speed, and over ~6 months of observations of a supernova remnant that is 'only' 6500 light years away, had moved ~2 seconds of arc, which corrupted the composite image.
Too bad, but of course there is a lot going on in the immediate vicinity of a pulsar.
Before leaving that page, I have to point out this short linked video https://media.stsci.edu/uploads/video_file/video_attachment/4408/STScI-H-v1721a-1280x720.mp4. It morphs the images through the spectrum, and it's annotated by spectral band. There's also a version without the annotations.
If you'd like to see changes in the Crab Nebula, similar to what I mentioned, here you go. This is from Chandra as well, but the page won't load, so here it is on YouTube. Seven months of changes, very near the pulsar, in a few seconds.
So, what else goes on in a nearby supernova remnant?
A lot. As one might expect, there is a lot of very energetic physics packed into a small area, only 6500 light-years away. When the supernova that left the Crab Nebula behind was recorded by Chinese, Japanese, and Middle Eastern astronomers in 1054, it would easily have been brighter than any star or planet in the sky.
What began as a star on the order of 10 times the mass of the sun left a neutron star only about 20 miles across, but weighing ~4.5 billion pounds per cubic inch. It's rotating 30 times per second, because angular momentum was conserved during the explosion. The energy it radiates doesn't stop in the X-ray regime observed by Chandra, but extends all the way up to ~10 TeV. That is similar to the LHC, whose initial beam energy was 7 TeV (now 13 TeV), but at a vastly higher power level. The LHC beam, after all, is about the diameter of a sewing needle.
The magnetic fields involved are enormous. That doesn't happen due to a conservation law, as with rotation; in fact the exact mechanism is still unclear. What is clear is that their reach is large. Here's an image from NASA's Hubble Captures the Beating Heart of the Crab Nebula (recommended). That pearly blue glow is synchrotron radiation, produced by electrons spiraling in a magnetic field.
Everything near a pulsar that can be ionized is ionized by those high-energy emissions. What's left are carbon and silicate dusts, and even those seem likely to be mostly charged (though electrostatically). So we have positively charged ions, and those now-free electrons. Those free electrons are pervasive, hence the blue glow.
That ionization, the pulsar's intense gravitational and magnetic fields, and its rotation combine to give rise to those famous pulses that gave pulsars their name. The basic mechanism is:
1- some of the material ejected in the supernova is pulled back into the star
2- the in-fall is caught in the magnetic fields, and pulled toward the poles
3- it's beamed back out at the poles, rather like lighthouse beams
4- the beam sweeps across us, in this case 30 times per second
Step 4 is possible because the axis of rotation is not aligned with that of the magnetic field. That seems a bit counter-intuitive, but it's also the case with the sun, and even here on earth (and something that has to be considered when navigating by compass). That's too much to go into here; books have been written about the Crab. Search on 'dynamo effect' or 'dynamo theory'.
We have been learning about this for a millennium now. While the pace of discovery is accelerating, the story of the Crab Nebula is far from told.
Thursday, March 23, 2017
Birding data: there is never just one thing going on
Anomalously high counts of White-crowned Sparrow and Steller's Jay have occurred here (Peoria, Linn Co., Willamette riverside) during March. WCSP counts have doubled previous records, and STJA are up by 50%. OTOH, I have yet to find my FOY Belted Kingfisher.
This probably means *nothing at all*. See https://en.wikipedia.org/wiki/Complex_adaptive_system.
First, let's toss the lack of Belted Kingfisher. There's a correlation with river levels: higher water levels than seen this year bring them in to some bottomland where they (also long-legged waders, which are also absent this year) are easily detected. That threshold hasn't been reached. The same thing might explain a complete lack of expected Great Horned and Western Screech-Owls: water levels may have been too high to support the record populations seen over the past couple of years, due to unfavorable conditions for prey. In short, the river levels seem likely to have been almost perfectly wrong.
How might those river levels affect WCSP and STJA? No idea. The data only run to 7 years, and support no conclusion. I suspect an increasing trend in passage of migrating STJA, but can't prove it. The uptick in counts is only 3 years old. In the greater scheme of things, that cannot be distinguished from noise in the data.
Might increased WCSP counts provide some evidence that something has actually changed? Sadly, no. My recording has changed, in a couple of respects.
1- Serious lack of time, for professional reasons
2- Variance in sunflower feeder habits
I think we can all understand #1. #2 is about raccoons, of all things.
I used to keep a sunflower feeder hanging, and just top it off in the morning, or whenever it seemed low (working from home). A few months ago, one or more raccoons found it, and as there was no good means of safeguarding it, it was brought in at night. And (crucially) re-hung whenever I got around to it in the morning.
Whenever I got around to it tended to be at a later time than "breakfast is always served". It's likely that I saw fewer WCSP before because they could visit any time, and more later when they had to adapt to a more limited opportunity, hence concentrating their numbers.
I'm probably going to stop feeding birds now. Well, perhaps hang a hummingbird feeder (FOY *female* Rufous Hummingbird today). It would be interesting to see how the data might skew, and it's safest to do it now, while the hardships of winter are largely over, but before breeding territories are established.
I expect the yearly species count (119, two years running) to crash, but that number won't matter either--just more short-term statistical noise. Food for thought, at best. At worst, it's an entirely fallacious raccoon-induced population crash.
Saturday, February 18, 2017
Total Solar Eclipse 2017
I live in a pretty cool place, from a certain nerdy viewpoint. 119 species of birds, two years running, might indicate a certain predictability, but a closer look at the data destroys that notion.
So how cool is this place, really? My subjective measures include things like species counts, whether I can get reasonable photos, etc. Subjective in this case means entirely subjective. So what might tip this place into Coolest Place I Have Ever Lived?
Yeah, that might do it. Though no photo did it justice. Film doesn't have the dynamic range, and digital cameras are worse in that respect. There's a wide pearlescent glow from the solar corona seen IRL which is entirely missing from short exposure times. This image predates optimizing over a set of stacked images.
It's a photo of a photo that has been hanging on some wall of pretty much every place I have ever lived since 1979. Yes, I am an old fart: deal with it. No, it's not related to Sauron in any way, save perhaps as being inspirational to film-makers for major production houses. Possibly. The mechanics of how films are actually made (and taxes and payments to the Tolkien estate avoided, etc.) entirely escape me.
You may want to visit https://en.wikipedia.org/w/index.php?title=Solar_eclipse_of_February_26,_1979&oldid=761573206
That's the link as of this date: I've been burned by not specifying dates. Pull quote:
Many visitors traveled to the Pacific Northwest to view the eclipse,[1] since it would be the last chance to view a total solar eclipse in the United States for almost four decades. The next over the United States will be the total solar eclipse of August 21, 2017.
Although the path of totality passed through Portland, Oregon in early morning, it was not directly observable from the Portland area due to overcast skies.[2]
That last line matters. In 1979 I was driving up the Columbia Gorge, seeing small holes of blue sky in wide overcast, and trying to judge where said blue holes might line up with the sun, during that brief period of totality. A fast car and a certain disrespect for law and order won the day. Everything lined up, and I skidded to a stop at Horsethief Lake State Park in time to see the whole event.
- Shadow racing through the gorge at a thousand miles per hour
- Weird greenish light, entirely unexpected, before totality
- Shadow bands rippling across the ground.
It was awesome, in the original sense of the word. This is the Pacific NW. I saw Mount St. Helens erupt a year later, so I'm not a stranger to drama.
So here we are, almost four decades later, as mentioned in the above pull quote. I'm now a certifiable Old Fart who never expected to live this long. But that narrow path of totality will sweep directly over my place on August 21. In place of vile winter weather, I have the best weather of the year, and all I have to do, essentially, is walk outside. How cool is that?
Being a complete nerd, I'll go a bit further. I'll spread my parachute canopy across the yard below a second-story deck, and hope for a shadow band photo opportunity, etc. But mostly, I just want to experience the event. I lack the words to describe a total solar eclipse. Perhaps that is the true meaning of 'awesome': you just can't really express it.
Of one thing I am certain: on 2017-08-21, this weird little place in small-town Oregon will become The Coolest Place That I Have Ever Lived.
Saturday, February 11, 2017
Exploring Data From the Linux Command Line
A few days ago, we saw the first signs that perhaps the worst of an unusually cold and wet winter might be ending: a temperature over 60°F! A neighbor commented that it had been a long time since the last one, and I was curious as to exactly how long. For reasons of my own, I keep data files on what's recorded at the nearest weather station with what I consider to be fairly reliable data. So it only took a couple of minutes of exploratory hacking around at a shell prompt to get my answer. Here's what I did, and the result I got.
grep '^161[0-2]' 1606010000-1612311600 | awk '{print $1" "$5}' | grep -E '6.{3}$' | tail -n1
1611191600 61.0
It seems longer, but the last day of ≥ 60.0°F temperature was 2016-11-19, and as a side-effect we also get the last time of the last day: 1600 (4 PM for those of you who don't use 24-hour time). We could get rid of that side-effect; side-effects are usually a Bad Thing in code. But in this case the source is obvious (as will be shown below), and it's entirely beneficial: it extracts another piece of information from our data at zero computation cost. Exploratory code for the win.
Before I get into what our pipeline is doing, a note about the file. These are raw data - fields are separated only by whitespace. Lines begin with time and date encoded as YYMMDDTTTT. Hence the meaning of the first field of the result seen above, and the file name 1606010000-1612311600, which reflects the start-stop dates and times of the file. That can be a useful convention: in this case it immediately reveals that the data are incomplete. The station failed to record after 1600 on New Year's Eve.
Additionally, we can use the wc (word count) program in line-count mode to see that we are starting with a file containing 5541 lines (records, though there is a 3-line header, which I won't bother to filter out).
wc -l 1606010000-1612311600
5541 1606010000-1612311600
1- grep '^161[0-2]' 1606010000-1612311600, in which grep (a pattern-matching tool) supplies all lines (records) from our file that begin (specified via ^) with 161, if the next digit is 0-2. I was only interested in months 10-12 of 2016 (and 2016 data are all that is in this file), because I knew the last date of ≥ 60.0°F would be in there somewhere. We now have only records from our period of interest. If we ended here, our output would be
1610010000 24.10 3.0 163.0 53.0 51.0 80.0 12.9 204.0 6.0 0.0
...
1612311600 48.70 2.0 99.0 35.0 35.0 87.0 13.3 173.0 8.0 47.0
I'm using the ellipses in place of 2608 lines of output. wc -l shows 2610. We've filtered out nearly half of our data. Now we pipe (the | character) those lines to awk.
2- awk '{print $1" "$5}', where we instruct awk (a pattern scanning and processing language, of which more later) to print only that first datetime field, a space, then field 5, which contains the temperature, for each line of input it receives. Now we're down to only the fields of interest within our period of interest. Had we stopped here, our output would still be 2610 lines, but only 2 fields out of 11, formatted as YYMMDDTTTT NN.N. This 2nd stage of our filter removed about 2/3 of its incoming data. I'm just guesstimating by looking at line lengths here, but you can get accurate numbers using wc again, before and after this stage. Specify -c instead of -l to count bytes instead of lines. I'll skip the demonstration. Now we send that on to grep again, but specifying different options.
3- grep -E '6.{3}$' contains the -E (Extended) option, which enables the {} syntax so that we can specify how many instances of a character we want to match. The preceding dot can be read as 'any one character'. The trailing '$' matches the end of line -- the opposite of the '^' we used in the first grep. The net effect is that only content that matches a '6' followed by any 3 single characters, followed by end-of-line, will survive. Given our NN.N format for the temperature field, we filter out anything except 6N.N, and wc -l now shows only 222 of those short lines left, of 2610. Having filtered out all but 1/7 or so of the data coming into this stage, we now filter down to one line - our answer.
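A quick way to sanity-check a regex like that is to feed it a few hand-made lines (these are made-up records in our YYMMDDTTTT NN.N format; the second and third should be filtered out):

```shell
# Only the 61.0 line has a '6' followed by exactly 3 characters at end-of-line.
printf '1611191600 61.0\n1611201600 59.5\n1610011600 16.0\n' | grep -E '6.{3}$'
# → 1611191600 61.0
```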
4- tail -n1, which returns only the last n lines; we specify n=1. Because the data are in increasing date/time order (as can be seen in the output of our first filter), this gives us our last datetime, and answers our question, with greater accuracy than we had thought to ask for.
If we needed the date and nothing but the date, we could modify our usage of awk. GNU awk has some very interesting capabilities, such as floating point math, true multidimensional arrays, etc. This entire task could have been done in awk, but I wanted to show more of the shell tools, and pipelines, not just Cool Things We Can Do With GNU awk. [1]
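For completeness, here's one hypothetical rendering of that all-awk version, run against a few made-up records in the same layout (sample.dat stands in for the real station file):

```shell
# A handful of records standing in for the real station file (same layout:
# YYMMDDTTTT in field 1, temperature in field 5).
cat > sample.dat <<'EOF'
1609301600 24.10 3.0 163.0 53.0 51.0 80.0
1611191200 30.00 2.0 120.0 59.5 50.0 82.0
1611191600 29.00 2.0 118.0 61.0 49.0 83.0
1612311600 48.70 2.0 99.0 35.0 35.0 87.0
EOF

# The whole task in one awk program: keep months 10-12 of 2016, remember
# the last record whose field-5 temperature looks like 6N.N, and print
# only the YYMMDD part of its datetime in the END block.
awk '$1 ~ /^161[0-2]/ && $5 ~ /^6[0-9]\.[0-9]$/ { last = $1 }
     END { print substr(last, 1, 6) }' sample.dat
# → 161119
```

Note the field tests mirror the two grep stages, and substr() trims the time-of-day that was a side-effect in the pipeline version.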
The Shell Will Probably Always Belong in Your Toolbox
I often use far more sophisticated tools when I want to take a long hard look at data. But file formats vary, data may be missing, etc. As a rule of thumb, you can expect to spend half of the total time spent analyzing data just seeing what's there, and cleaning it up. For much of that work, the shell is a great tool, and it's actually very common to spend a bit of time using the command line to explore. In a broad view, command-line tools can help you determine, quickly, whether a particular data source contains anything of interest at all, and if so, how much, how it's formatted, etc. And finally, the commands can be saved as part of a shell script, and used over an arbitrary number of similar data files.
To a point, anyway. Shells are slow (particularly bash). Though of course there are tools to quantify that as well, and timing work on a subset of the data can give you an idea of when you are going to have to use something else. 'time' is available as a built-in if you are using the bash shell, and any Unix or Linux will also have a 'time' binary somewhere on your search path if the appropriate package is installed. On this machine it's /usr/bin/time, packaged as 'time'. Everything else, except the shell itself, is in the 'coreutils' package. Which says something about how useful these tools are. If you aren't using them, you quite literally are not using the core of the Linux/Unix tools.
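As a sketch of that timing approach, using bash's built-in time (with a tiny synthetic file standing in for the real station data; the layout mimics the post's file, but the numbers are made up):

```shell
# Build a small synthetic stand-in for the station file (YYMMDDTTTT in
# field 1, temperature in field 5), just to have something to time.
for h in 0000 0400 0800 1200 1600; do
  printf '161119%s 48.70 2.0 99.0 61.0 35.0 87.0\n' "$h"
done > sample.dat

# bash's `time` keyword reports real/user/sys for the whole pipeline.
time grep '^161[0-2]' sample.dat | awk '{print $1" "$5}' | grep -E '6.{3}$' | tail -n1
```

On a real, larger file, running this over a subset and extrapolating gives a rough idea of whether the shell will hold up, or whether it's time for a heavier tool.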
That is probably a mistake. There is a lot of data out there, stored as textual files of moderate size.
My Ulterior Motive for This
I wanted a post such that:
- I could advocate the command line to people who seem to inappropriately default to spreadsheets, which are nothing more than another tool in the box. That box should contain several tools. Consider unstructured data. Or consider binary data formats, which are an intractable problem for both shells and spreadsheets.
- Had absolutely nothing to do with security work. Because people are going to be justifiably sensitive about exactly whose security data I might be using as an example. But everybody talks about the weather.
https://drive.google.com/open?id=0B0XLFi22OXDpR3h0UUQ1cmNWbkk
Note to self: find another home for this sort of thing. Google Drive can't even preview a text file.
Note to all: this is not a promise to keep it there for any significant period of time. If I need the space for other things (like client-related things), that file is very, very gone. I recently VVG'ed most of what was in /pub.
[1] I do have one idea for something I'll do with awk one of these days. Because who doesn't like univariate summary statistics combined with 4000 year old Babylonian math, and using NIST-certified results to validate (or invalidate, as the case may be) our code?
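For what it's worth, a sketch of that idea (hypothetical, and certainly not yet validated against the NIST reference datasets): the sample standard deviation in awk, with the square root computed by the ~4000-year-old Babylonian (Heron) iteration rather than awk's built-in sqrt(). The input values here are made up.

```shell
# Mean and sample standard deviation, square root via Babylonian iteration.
# Note: the one-pass variance formula used here is exactly the sort of thing
# the NIST datasets would flag on ill-conditioned data; fine for a sketch.
printf '61.0\n59.5\n48.7\n35.0\n' | awk '
  { n++; sum += $1; sumsq += $1 * $1 }
  END {
    mean = sum / n
    var  = (sumsq - n * mean * mean) / (n - 1)
    x = var > 1 ? var : 1              # initial guess for the iteration
    for (i = 0; i < 40; i++)
      x = (x + var / x) / 2            # Babylonian/Heron step: x -> (x + S/x)/2
    printf "n=%d mean=%.4f sd=%.4f\n", n, mean, x
  }'
# → n=4 mean=51.0500 sd=12.0212
```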
Tuesday, September 20, 2016
Greater Yellowlegs
This August, I didn't find nearly the number of species of birds that I did in 2015. This month is also a lot slower. Unlike some birders, for whom the list length is everything (insert obvious crude comparison here), I'm fine with that. Species counts are just another tool I use to try to understand what is going on, on my patch. Counts happen to be a powerful tool, if used well, but it's about understanding, not competition.
Here are a couple of Greater Yellowlegs (a sort of large sandpiper) that I don't see enough of. Work can be a pressure cooker environment, the recent news reports are usually depressing, etc. Being able to walk out the back gate, go down to the river and see a couple of neat birds and fall color, reflected in late-summer low river levels, is a welcome break.
Well. That either matters to you, or it doesn't. If not, I hope you have some other means of coping.
Greater Yellowlegs, Willamette River, Linn Co., OR, 2016-09-04
Friday, August 26, 2016
Does work really expand to fill all available hours?
That might be a perception issue. In a second effort (this week) to free up more time, I just invested half an hour to run an optimization experiment. Amazingly successful: I'll probably save 4-5 hours per week, for a month or more. Huge win, to be sure. Counting both efforts, I get 6-7 hours back.
The thing is, I didn't start really looking for optimizations until I passed a pain threshold. I expect that is pretty typical behavior for us all, and that really sucks for me, on a couple of levels.
First off is professional. Always optimizing stuff is part of the gig.
Second is just personal embarrassment, because missing a forehead-slappingly easy test for bias is, well, personally embarrassing.
That bit of folk wisdom, that work expands to fill all available hours? Like much folk wisdom, not buying it. This was just the most recent iteration of the problem. I think it's much more about pain thresholds, and when we finally realize we can't fit that next Desired Thing into the schedule. Only then do we scurry off and find fixes for the problem.
Perhaps this is a LifeHacking thing. Hard to tell: trying to follow whatever fashion is currently playing out on the Internet is usually an exercise in futility.
But I plainly need to lower my pain threshold, and optimize sooner.