Tag Archives: tech

Tesla Powerwalls Installed

My two Tesla Powerwall 2s were installed last week and now I’m running on my own self-generated solar power – after dark! Powerwalls not only need no maintenance or fuel (unlike any fuel-burning generator), they also let you time-shift energy to avoid drawing power from the grid at peak demand times.  Of course, this also means that I’ll still have power (and, more importantly, running water!) the next time PG&E needs to shut down the power grid.  Yay!
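To put rough numbers on the time-shifting idea, here’s a back-of-the-envelope sketch.  The rates and round-trip efficiency are made-up illustrative values (not actual PG&E tariffs); 13.5 kWh per unit is the Powerwall 2’s nominal usable capacity.

```python
# Back-of-the-envelope sketch of time shifting: store cheap (solar or
# off-peak) energy and use it during the utility's peak rate window.
# The rates and efficiency below are made-up illustrative numbers.

PEAK_RATE = 0.40            # $/kWh during peak hours (assumed)
OFF_PEAK_RATE = 0.15        # $/kWh off-peak (assumed)
USABLE_CAPACITY = 13.5 * 2  # kWh for two Powerwall 2 units (nominal spec)
ROUND_TRIP_EFF = 0.90       # assumed round-trip efficiency

def daily_shift_savings(kwh_shifted: float) -> float:
    """Savings from serving peak-hour load with energy stored off-peak."""
    kwh_shifted = min(kwh_shifted, USABLE_CAPACITY)
    cost_to_store = kwh_shifted / ROUND_TRIP_EFF * OFF_PEAK_RATE
    cost_avoided = kwh_shifted * PEAK_RATE
    return cost_avoided - cost_to_store

print(f"${daily_shift_savings(20):.2f} saved per day shifting 20 kWh")
```

Even with generous assumptions the dollar savings per day are modest – the bigger wins for me are the backup power and running the house on my own solar after dark.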

I really like the Tesla app for configuring and monitoring your Powerwall and, if present, your solar PV system.  It continually displays the flow of power between your home, Powerwalls, solar panels and the power grid in real time.



The Unistellar eVscope

(February 2020)

I recently received my eVscope from Unistellar and after just a few sessions with it, I thought I would share my thoughts and experiences with it so far – particularly since there wasn’t a lot of info available when I ordered it back in July of 2019.

Overview

The Unistellar eVscope is quite different from a traditional optical telescope.  It’s a highly integrated and automated digital imaging telescope that enables you to easily find and view deep sky objects in color and detail that would not normally be perceptible to your eye looking through a normal optical telescope.  In addition, the eVscope is designed to let you easily participate in and contribute data to crowd-sourced “citizen science” projects.

The eVscope is a 4.5-inch Newtonian reflector that captures light on a highly sensitive, low noise Sony IMX224 color sensor while using a motorized alt-az tracking mount and autonomous field detection to automatically identify, align and continually track its view of the sky.  Integrated image-processing software takes and combines an ongoing series of short exposures to generate an image in almost real time that brings out much of the very low light, color and detail that’s not visible to the human eye even when looking through a normal telescope. This view accumulates over just seconds and minutes and is displayed both in the telescope’s eyepiece (on an OLED display) as well as on a WiFi-connected smartphone.  The whole thing is self-powered via an integrated 9-10 hour rechargeable battery, fits into a large backpack and weighs just under 20 lbs. including the provided tripod.
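As I understand it, this “Enhanced Vision” style of live stacking boils down to averaging many short, aligned exposures: the random sensor noise shrinks roughly as 1/√N while the signal stays put.  Here’s a toy single-pixel simulation of that principle (a generic stacking sketch, not Unistellar’s actual pipeline):

```python
import random
from statistics import pstdev

# Toy model: one pixel with a fixed true brightness plus random noise
# in each short exposure.  Averaging N aligned exposures reduces the
# noise roughly as 1/sqrt(N).  Numbers below are illustrative only.

random.seed(42)
TRUE_SIGNAL = 100.0   # "real" brightness of the pixel
NOISE_SIGMA = 20.0    # per-exposure noise (assumed)

def simulate_exposure() -> float:
    return TRUE_SIGNAL + random.gauss(0, NOISE_SIGMA)

def stack(n_frames: int) -> float:
    """Average of n short exposures -- the 'live stacked' pixel value."""
    frames = [simulate_exposure() for _ in range(n_frames)]
    return sum(frames) / n_frames

def stacked_noise(n_frames: int, trials: int = 200) -> float:
    """Residual scatter of the stacked estimate over many trials."""
    return pstdev(stack(n_frames) for _ in range(trials))

# Noise drops dramatically as frames accumulate:
print(stacked_noise(1), stacked_noise(100))
```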

In other words, it’s quite an impressive level of integration!  While you can of course outfit a normal telescope and tracking mount of your choosing with the necessary cameras, computer, tracking and image stacking software, WiFi connectivity, battery power, etc., you then also have to develop the expertise to use and troubleshoot this software – and it’s not trivial. To be clear, the eVscope is not really designed to be a sophisticated imaging tool or to compete with the results you can eventually get with lots of practice and expertise and many hours of capturing and processing images.  Instead, the eVscope is intended to let you easily see and enjoy much more detail than you can with a normal, unaided telescope and it provides quick setup, ease of control from your smartphone, and a fun, real time viewing experience all wrapped up in a lovely, convenient little package.

It is however not cheap to integrate all these components into such a convenient package.  As such, I wouldn’t recommend it for someone wanting to dip their toe into astronomy on a small budget.  It’s pretty clear though that this makes for a wonderful tool for astronomy outreach programs anywhere and I’m really looking forward to sharing the experience with friends and their families.

Citizen Science

As I mentioned above, the eVscope is also designed to participate in crowd-sourced “citizen science”, in partnership with the SETI Institute.  As per their web site, the eVscope “allows users around the world to participate in observing campaigns to image and collect data on objects of special interest to researchers.  In Campaign Mode, image data is automatically sent to a data repository at the SETI Institute’s headquarters in Silicon Valley. The international scientific community can then access unprecedented volumes of image data for specific objects, from thousands of telescopes around the world, at different dates and times. This in turn, can enable new discoveries and enhance our understanding of the universe around us.”

Just in the past week, I had the opportunity to participate in one of these observing sessions.  I received an email providing instructions for a particular target and observing time to collect data on an exo-planet transit of “WASP-43b”.  The procedure involved setting up beforehand, selecting and confirming the target, and then starting the Enhanced Vision capture process and letting it run autonomously for several hours as it tracked the target.  Afterwards, I captured 30 seconds of “dark frames”, then initiated the download of the data from the telescope followed by its upload to their servers.  While I encountered a few issues along the way (included in my bug list below), it was fun to participate in a data-gathering session like this.

Setup and Use

I recorded a video to demonstrate the ease of setting up and using the eVscope:


I forgot to record using the focus ring on the base of the scope, so perhaps I’ll add that later, but Unistellar provides a nice page detailing how to use it with the provided Bahtinov mask: How to use the Bahtinov mask?  (It’s great how the mask is integrated into the cap!)

While I have encountered a lot of bugs (version 0.9), including fairly frequent random crashes (the app quits unexpectedly), the eVscope already provides a fun and amazing experience to share with people.  (Presumably they’ll be updating and improving the software over time.)  The ease of setup and the speed with which you can get to viewing objects is great.  I really like the convenient size of the thing, including the integrated power supply and the optional padded backpack to carry and protect it.

The initial star alignment process is super fast (around 30 seconds) and its autonomous field detection system seems to do a great job of tracking the sky and dealing with field rotation over several hours.  I do find the views appear slightly soft (presumably from the effort to track, align and integrate frames over many minutes) but still quite enjoyable, and perhaps this will improve with future updates.  You can see some sample images below.  I should note that I haven’t tried collimating the scope yet, so I’ll update here when I get the chance.
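For the curious, the field rotation an alt-az mount has to undo can be estimated with the standard formula ω · cos(latitude) · cos(azimuth) / cos(altitude), with azimuth measured from north.  A quick sketch (using a latitude of ~37° N as an example – note the rate blows up for targets passing near the zenith):

```python
import math

# Standard alt-az field rotation rate estimate.  The sign just indicates
# the direction of rotation; latitude/azimuth/altitude are in degrees.
EARTH_RATE_DEG_PER_HR = 15.041  # sidereal rate of Earth's rotation

def field_rotation_deg_per_hr(lat_deg, az_deg, alt_deg):
    lat, az, alt = (math.radians(x) for x in (lat_deg, az_deg, alt_deg))
    return EARTH_RATE_DEG_PER_HR * math.cos(lat) * math.cos(az) / math.cos(alt)

# Example: from ~37 deg N latitude, a target due south at 45 deg altitude
# rotates in the frame at roughly 17 degrees per hour.
rate = field_rotation_deg_per_hr(37, 180, 45)
print(f"{abs(rate):.1f} deg/hr")
```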

Another aspect of the very quick and easy setup is that it takes less than a minute to pull out the scope on a whim, stand it up on the open patio outside my bedroom, remove the cap, turn it on, dash back inside out of the cold winter night, and settle in with my phone or iPad to mess around exploring the sky, in warmth and comfort.  I definitely cannot set up and align my 8” SCT and German equatorial mount so quickly and easily even with the auto-align accessory, plus there’s setting up cameras, laptop, myriad power and USB cables, etc.  Not to mention the disassembly and take-down time afterwards!

That said, I don’t think you should think of the eVscope as astrophotography gear. Everything is integrated to make it easy to observe deep sky objects with color and detail you can’t see without the aid of sensors, but it does not provide the means to capture frames and do your own stacking or more sophisticated and detailed imaging with a non-color sensor and color filters, etc. I would not expect this telescope to compete with custom gear where you have control over everything (and of course have to learn how to do everything). That is not the purpose of its design. Similarly, the cost reflects the benefits of integrating all these pieces (sensor, tracking software, stacking/imaging software, display, power supply, etc) into a small and elegant package without any cables or separate components to manage while also making it dead simple to use. That’s what you’re paying for and that’s the trade-off.

The provided documentation is pretty good in some areas but a little weak in others. (As of February 2020.) For example, I was surprised how long it took me to find a little blurb buried in a list at the back of the printed guide that explained how to tell if the battery was fully charged.  And really, the charge state should also be displayed in some form in the app!  As of 0.9.3, there is now a rough charge state shown in the app in the form of a battery icon (no percentage estimate though).

Also, what’s going on with the gain/exposure controls being usable only in live view and the contrast/brightness controls only in enhanced vision mode?  Does changing the gain/exposure before starting enhanced vision affect the results?  There’s very little information on how best to make use of these controls or how the automatic settings work – in auto mode with enhanced vision, it seems to randomly try different contrast/brightness settings over many minutes, going back and forth rather than converging in a particular direction.

Another example is that there’s no explanation (or instructions for use) of the “download data” and “upload data” buttons.  I only learned how to use them after getting the email for the “citizen science” observation that included instructions to use those buttons.  It still leaves me wondering, though, what that data consists of and whether we should be regularly downloading/uploading it.

However, it does look like Unistellar has been actively adding explanatory content to their online knowledge base over the past couple of months, so more and more questions are getting answered.

Sample Views

Here are a couple of real-time recordings of the Unistellar app showing the live view from the eVscope of the Orion Nebula (over 3.5 minutes) and Bode’s Galaxy (over 6 minutes):


Here are some images illustrating the views you can generate and enjoy in just minutes with the eVscope.  I’ve included both screenshots of the full image displayed on my phone as well as the circular cropped image that it produces for display in the eyepiece and that it allows you to save from your phone.  (The eyepiece shows only the circular cropped image, it does not display the descriptive text or circular outline.) I have not done any further processing on these images – these are just as they were originally generated by the eVscope app or screenshot-captured off my phone.

I’ll mention right now that I think it’s very annoying that the app does not currently provide an option to save the full, uncropped, undistorted image. The images saved from the app also unfortunately have an “eyepiece lens” effect applied to them.  I should not have to go to the trouble of capturing a screenshot of my phone and then manually cropping it to get the full, undistorted image saved as I’ve done for these samples.

Click on any image below to see the full size version and to browse the gallery:

Flame Nebula NGC 2024

Running Man Nebula NGC 1977

Horsehead Nebula

Bode’s Galaxy M81

Orion Nebula M42

The Sony IMX224 Exmor CMOS color sensor used in the eVscope has a resolution of 1305 x 977.  The circular cropped images saved by the app are 1080 x 1080 and my iPhone XS screenshot crops tend to be around 1118 x 838.

Crab Nebula M1

Andromeda Galaxy M31

Feature Requests

As of February 2020, version 0.9 of the Unistellar app, running on iOS 13:

  • PLEASE: Option to save full, uncropped image: Currently there is only an option to save or share the circular cropped image.  It’s both heavily cropped and mildly distorted around the edges to give it a sort of eyepiece lens effect.  Please provide a built-in option in the app to save the full uncropped, undistorted image!  I should not have to go to the trouble to capture a screenshot of my phone and manually crop it to get the full image.
  • Display useful status/info: Please provide current status info in the app like Enhanced Vision mode’s elapsed exposure time and/or stacked image count, current sky coordinates and battery charge state estimate.  (The elapsed exposure time and coordinates are only currently available after the fact in the saved images.)  Update: As of 0.9.3, there’s now a battery charge state icon.  No percentage estimate, but better than nothing.
  • Goto support via SkySafari: Would be wonderful to be able to use SkySafari to select and goto targets with the eVscope (as you can with many other telescopes/mounts), as well as to be able to easily see and explore where the scope is currently pointing.
  • Dedicated iPad version of app: The current iOS app is just scaled up from the phone version on the iPad and does not take advantage of all the additional screen real estate.  The iPad would be an even better platform for viewing/controlling the eVscope if the available space were well utilized.
  • Allow panning of zoomed view in the app: The app’s zoom feature currently only zooms into the center of the image – you can’t zoom in anywhere else in the image.
  • Improve catalog display: The current style of displaying catalog items as a grid of large icons requires that the object names often be truncated.  Also, the large generic icons to indicate the type of object aren’t a great use of the available space.  How about a list-oriented view (and a smaller icon) to make better use of the screen real estate?
  • Shared WiFi connectivity: Would be nice to be able to optionally configure the eVscope to use an available WiFi network instead of its own WiFi at times so as to 1) support extended WiFi reach, 2) allow devices to access both the internet and the telescope simultaneously, 3) to avoid the need to always switch to the telescope’s WiFi, and 4) to perhaps help avoid the occasional disconnects that happen now.
  • More expansive view through the eyepiece: I can see that the eVscope was very much designed to provide an optical, telescope-like viewing experience – which is of course why there is an eyepiece on the scope at all.  However, I think it is a mistake not to maximize the apparent field of view in the eyepiece to provide more of a grand and wondrous view.  To that end, I wonder if you could use a different lens with the eyepiece to really open up the apparent magnification and field of view of the image you’re able to generate.

Currently you see a small constrained view far down the end of a tube.  You should really try to shoot for a big gorgeous panoramic view, a “spacewalk vista”, like what you get with TeleVue’s fantastic, wide apparent field eyepieces.  Could you simply make use of the same kind of optics and/or display technology inside the electronic viewfinders that Sony and other camera manufacturers use in their digital SLR cameras?  These digital display viewfinders do a fantastic job of enlarging the apparent view on these tiny little displays.  They’re a joy to use and provide a much larger, clearer, detailed view than you get from the displays mounted on the backs of these same cameras.  I realize this would require a hardware change but oh, what a view that would be!

Along these same lines, could there be a way to make use of the full uncropped image in the eyepiece?  With relatively large targets, the uncropped view on the phone’s display is much more expansive and enjoyable than the much constrained circular cropped view.  Could there be a way to present the full uncropped rectangular view and allow it to be rotated in the eyepiece to deal with changes in the telescope’s orientation?

  • Send Observation Parameters: Given the eVscope’s ability to participate in “citizen science” observations and data collections, it seems like there should be a more direct way to send observation parameters (like RA/Dec coordinates and exposure/gain settings) to the scope from an emailed observation request.  Perhaps encoded in a URL that’s interpreted by the Unistellar app?  It’s kinda silly that you have to transcribe lengthy coordinates from an email on the phone to the Unistellar app on the phone.  You can’t even copy/paste right now!
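To sketch what such a deep link could look like – the unistellar:// scheme and every parameter name below are pure invention on my part, not an actual Unistellar format:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical deep link format for sending observation parameters to
# the app.  Scheme, host and parameter names are all made up here.

def parse_observation_link(url: str) -> dict:
    """Extract observation parameters from a hypothetical app deep link."""
    parsed = urlparse(url)
    if parsed.scheme != "unistellar" or parsed.netloc != "observe":
        raise ValueError("not an observation link")
    params = parse_qs(parsed.query)
    return {
        "ra_deg": float(params["ra"][0]),    # right ascension, degrees
        "dec_deg": float(params["dec"][0]),  # declination, degrees
        "gain_db": float(params["gain"][0]),
        "exposure_ms": float(params["exp"][0]),
    }

link = "unistellar://observe?ra=120.25&dec=-10.5&gain=25&exp=3970"
print(parse_observation_link(link))
```

Tapping a link like this in the observation email could hand the whole parameter set to the app in one go – no transcribing coordinates by hand.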

Issues / Bugs

As of February 2020, these are the issues I’ve been seeing with my current version of the Unistellar app (version 0.9) for iOS 13 (iPhone XS, iPad Pro). I’ve already heard back from Unistellar that many of these issues will be addressed in their upcoming 1.0 release of their software.  I’ll strike these out as I find they get addressed:

  • The app will immediately crash/exit when you return to the app after being disconnected from the scope or WiFi or after having to leave the app for some reason and come back.  The app will also occasionally crash/exit for other unknown reasons in the midst of using it, but I haven’t tried to maintain a list of each circumstance.  Hopefully you’re receiving the iOS crash reports from Apple.
  • I’m seeing a patch or trail of green pixels on most images in the same place.  I’m guessing I’ve got a hot/stuck pixel on my image sensor and the random walk pattern of the pixel is just the effect of combining many images as the field shifts and rotates while tracking the target.  Is there support for subtracting out hot/stuck pixels?
  • Often the app will forcefully halt the enhanced vision mode without warning and without a chance to save what you have so far – it just returns to the live view.  Sometimes there is no message at all and other times there will be an error message like “too bright” even when it appears there is still much more light that could be captured over most of the frame: only a couple of bright stars in a large field of dim nebula, before the nebula has even really become visible.  Please don’t forcefully stop!  (Also, how is it that we were instructed to leave enhanced vision mode running for hours during the recent exo-planet transit when I have had it quit after just 10 minutes or so on a nebula??)
  • The gesture to pinch zoom is buggy and at times it jumps around or refuses to stick.  Strangely, at other times, it works just fine.  I haven’t picked up on a pattern as to when it doesn’t work.
  • Adjusting the contrast/brightness controls while enhanced vision is running usually results in the slider jumping back to its previous position while it completes the current frame, and only then does it jump to where you set it.  If you don’t realize it’s going to do this, you’ll try to move it again and again and only get more confused as to why it keeps jumping around.  It needs to at least stick in the new position even though the next frame is still being generated.
  • Once I learned how to use the download/upload buttons (no documentation until I received that shared observation invitation email), I found that both the download and upload sequences would sometimes stop processing after many minutes and I would have to quit and restart them.  This happened several times (4-5 maybe?).
  • Why are the download/upload functions so incredibly slow??  Even over local WiFi from the telescope to the phone?  How many gigabytes could that possibly be? I don’t have enough storage on my phone for it to take that long. Is there a bug here?
  • On one occasion, the joystick/slew buttons seemed to stop working but after quitting and relaunching the app I found that it had actually slewed but had apparently stopped updating the displayed view.
  • On another occasion, the joystick slew buttons stopped working and the scope view began shifting randomly.  Had to quit/relaunch the app to fix it.
  • Another time the app got stuck in a goto/slewing operation and none of the buttons worked any more and I couldn’t select another target.  The telescope seemed to no longer be tracking the sky; the star field just drifted in the display.  Force-quitting the app didn’t help.  I had to power down the telescope and restart it.
  • Sometimes lots of large random distortions and smears of light and color appear in the display. (I’m not talking about while slewing which would be expected.)
  • On a couple of occasions, the “goto” catalog list would jump or reset its scroll position while trying to scroll through it making it difficult to select the desired item.  Usually it’s fine – I haven’t figured out when this happens yet.
  • Seems like the app or the telescope gets confused if you exit the app while enhanced vision mode is engaged.  Are you supposed to be able to exit and come back while enhanced vision is in progress?
  • Please rework how those three sequential messages work asking whether to automatically save the image after running enhanced vision.  It was frustrating trying to get it to just prompt me to save the image without also asking me two follow-up questions every time.
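On the green-pixel issue above: the standard fix in generic stacking pipelines (not necessarily what Unistellar would do) is to map the bad pixels once, e.g. from a dark frame, and replace each with the median of its neighbors before stacking.  A toy sketch:

```python
from statistics import median

# Toy hot/stuck pixel repair: replace each known-bad pixel with the
# median of its good neighbors.  Pure-Python illustration only.

def fix_hot_pixels(image, hot_coords):
    """image: 2-D list of pixel values; hot_coords: set of (row, col)."""
    h, w = len(image), len(image[0])
    fixed = [row[:] for row in image]  # work on a copy
    for r, c in hot_coords:
        neighbors = [image[rr][cc]
                     for rr in range(max(0, r - 1), min(h, r + 2))
                     for cc in range(max(0, c - 1), min(w, c + 2))
                     if (rr, cc) != (r, c) and (rr, cc) not in hot_coords]
        fixed[r][c] = median(neighbors)
    return fixed

img = [[10, 10, 10],
       [10, 255, 10],   # stuck pixel in the middle
       [10, 10, 10]]
print(fix_hot_pixels(img, {(1, 1)}))
```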


Check Your Old Sunglasses and Goggles

We were just about to toss out some old goggles that were in the closet and, on a lark, decided to check their UV protection.  Mine were maybe 10+ year old Oakleys (which I haven’t used in many years) and Darlene’s were her old Smiths that she had been using up until last month.  Both are fitted with orange “high contrast” lenses.

Well, both appear to be failing to provide UV protection now.  Not good!

Sunglasses and ski goggles can lose their UV protection over time.  So… double-check your old sunglasses and goggles for UV protection!

I tested with a UV flashlight (“black light”) I picked up recently off Amazon when I decided I didn’t want to trust the unknown Chinese manufacturer (“Oho”) of some new camera goggles I bought.  As it turns out, those new goggles and my old Liquid Image camera goggles I’ve been using for many years pass the UV flashlight test fine.  (As do my and Darlene’s sunglasses.)

You can do a quick and dirty test just using a $20 bill and one of these inexpensive UV flashlights.  You can also get a more professional test (with an actual UV blocking measurement) from your local optician.

Here’s a bad result on left (strip is fluorescing due to UV light getting through lens), good result on right – both are orange tint lenses and many years old:


Fire and Smoke

Lots of wildfires in California lately – over 100 in the Santa Cruz area this year – so they’ve closed many of the county parks to try to reduce the risk.  This one a couple of weeks ago (the Rincon fire) was quite visible from my place but happily they were able to get it under control in a couple of days:


After seeing someone test the air filtration of Tesla’s Model X and its “biodefense mode” against the heavy smoke we’re getting from our wildfires this month, I decided to pick up an inexpensive air quality sensor to test my home’s air as well as my Model 3’s more mundane filtration system.  (The Model 3 doesn’t have the Model X’s fancy “biodefense mode” or huge HEPA filters.)

With the PM2.5 sensor reading 150 μg/m3 (unhealthy) in the San Jose area (due to smoke from the Camp Fire that burned through Paradise, CA), I found that the Tesla Model 3’s air filter would bring things down to the 20s in the cabin in just a few minutes when recycle air was turned on.  Later, I stopped and made a video to record it falling from 135 to 5 μg/m3 in less than 10 minutes. It climbed back up to the 80s pretty quickly though when I turned off recycle air and let it bring in fresh air:


This video was even picked up by Teslarati (“Model 3 protects owner…“) and re-tweeted by Elon.

Hi, Elon!  But they didn’t pick up on my follow-up test to compare the Tesla to a Toyota:

Comparison with Toyota RAV4 EV

I decided to repeat the test with my 2013 Toyota RAV4 EV.  This time the starting air quality wasn’t nearly as bad as my initial test but both the Tesla and the Toyota were able to filter the cabin down to a reading of zero from a start of 50 μg/m3 with recycle air turned on. At full fan speed, the RAV4 took about 10 minutes and the Model 3 was able to do it in about 3-4 minutes.

With recycle air turned off (fresh air intake on), the PM2.5 reading in both cars climbed up again. The Tesla was able to hold it around the low to mid 30’s but the RAV4 went up to essentially the outside reading of 50 μg/m3 again.  So the Model 3’s system does work better.
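Both recycle-air runs look like textbook first-order decay: the particle count falls exponentially toward a floor set by leakage and filter efficiency.  Here’s a quick sketch fitting a decay constant to the Model 3’s 135 → 5 μg/m3 drop from my earlier test (floor assumed to be ~0 – a rough estimate, not a measured CADR):

```python
import math

# First-order cabin filtration model: C(t) = floor + (c0 - floor) * exp(-k*t)

def decay_constant(c0, c_t, minutes, floor=0.0):
    """Per-minute decay rate k fitted from a start and end reading."""
    return math.log((c0 - floor) / (c_t - floor)) / minutes

def concentration(c0, k, minutes, floor=0.0):
    """Predicted PM2.5 after `minutes` of recirculation."""
    return floor + (c0 - floor) * math.exp(-k * minutes)

# Fit to the observed Model 3 drop: 135 -> 5 ug/m3 in ~10 minutes.
k = decay_constant(135, 5, 10)
print(f"k = {k:.2f}/min, half-life = {math.log(2) / k:.1f} min")
```

By this rough model the cabin concentration halves roughly every two minutes on recirculate, which matches how quickly the readings fell in the videos.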

One other thing of note is that the RAV4 ended up with a much higher concentration of TVOC (total volatile organic compounds), even though the vehicle is five years old. Presumably this is off-gassing of some of the materials in the cabin.  Oh, and I forgot to turn off the A/C in the RAV4 for the test – hence the temperature drop.

Here’s more detail in screenshots – RAV4 start and finish with recycle on:


Model 3 start and finish with recycle on:


On a subsequent four-hour drive to Tahoe in the Model 3, I encountered much worse air along the way (San Jose, Central Valley, Sacramento, etc). I’d guess the PM2.5 count was easily at least 150 μg/m3 and probably much higher in places, but I avoided opening the windows to test it. I kept the air on recycle and saw that the particle count held around 20 but sometimes climbed into the 30s. Not bad, given how bad it was outside.


Maker Faire 2018

Darlene and I made it out to Maker Faire again this year:

Click through for the full gallery:



First Launch of the Falcon Heavy

On Tuesday, February 6th, SpaceX successfully launched their Falcon Heavy rocket on its inaugural flight, sending Elon Musk’s original Tesla Roadster and “StarMan” on a far-reaching orbit around the sun as a test payload.  Happily, I was able to fly out to Florida and experience the launch firsthand from the Kennedy Space Center’s closest available viewing location for the general public – just 3.9 miles away from the launch platform!  (It’s just too bad they haven’t removed the historic-but-no-longer-needed launch tower at LC-39A, as it was sitting between us and the Falcon Heavy.)  Still, it was quite the show with essentially three of their Falcon 9s strapped together and all twenty-seven engines firing simultaneously!  Not to mention the amazing, never-seen-before, simultaneous return of the two outer boosters back to the nearby landing zone!

I’ve made a video of what it was like to watch (and hear) from our vantage point:

The Falcon Heavy launch as experienced from the closest public viewing area

This viewing location is part of Kennedy Space Center’s “Feel the Heat” ticket package which takes you to the Apollo/Saturn V Center to view a launch and includes a buffet, some commemorative items, and return entrance to the Kennedy Space Center on a later date to enjoy the rest of the exhibits.

You’re given an assigned arrival time some 5-6 hours before the launch to catch your bus (and told not to come earlier) but for this historic event, there were so many people that it took hours to get through the security gates, onto a bus (really? loading the buses serially??) and out to the viewing area.  By the time we unloaded from the buses at the viewing area, the pickings were slim for anywhere on the grounds to set up a tripod with a good, unobstructed view because apparently many folks had shown up an hour or more earlier.  Anyway, I staked out a spot between others some three hours before the scheduled launch but had to skip the buffet to keep watch over all my gear.

The launch ended up being delayed several times due to high altitude wind shear and we were all getting a little nervous that they’d miss their launch window for the day (1:30 pm – 4 pm) as they rescheduled all the way up to 3:45 pm.  But then, about an hour before that, they made the call to go ahead and start loading the liquid oxygen – meaning a go for launch!  Hurrah!

And then 5, 4, 3, 2, 1, …. and great clouds of steam erupted with 27 engines firing – quite the sight!  We couldn’t see the rocket until it cleared that annoying (and unneeded!) tower, but afterwards the light intensity of the exhaust was incredible as it climbed into the sky.  You hardly notice the absence of sound from the rocket with the cheers of the crowd around you, but a few seconds later it starts to come across – and it’s an amazing, stuttering roar.

Then you get to watch it climb and roll and, higher up, begin to build a beautiful column of vapor – which it eventually disappeared into.  After a bit, it reappeared further east as a faint set of exhaust plumes still coursing away.  On the monitor, we could watch and hear announcements of each successful milestone and cheers would erupt each time – like with the separation of the side boosters and their retro-firing to return to Cape Canaveral.

Minutes later the two side boosters appeared in our sky coming down at incredible speed.  We all lost track of them though when they cut their engines again and unfortunately many of us weren’t in a position to see them again when they reignited for their final deceleration over their landing targets. We could of course see the video feed on the monitors, perfectly landing themselves (vertically!), like something out of science fiction – but it wasn’t until after they had landed that their twin sonic booms reached us.  We all of course learned later that the center core didn’t fare so well because two of the three needed engines were unable to restart (not enough ignition fuel) and it crashed into the ocean close enough and hard enough to damage the autonomous drone ship that was waiting for it.  But hey, this was a test flight!


The Falcon Heavy is now the most powerful rocket in the world, with the most lifting capability – though it will soon be surpassed by NASA’s upcoming “SLS” rocket as well as SpaceX’s own future “BFR”.

Meanwhile, “Starman” continues his/her epic journey in space:

Click through for my full photo gallery from the launch and my follow-up visit to the Kennedy Space Center:


Here are links to more videos of the first Falcon Heavy launch:


A Flying Camera

2.9 minute video demonstration of the Mavic Pro (223 MB)

This is a short little video montage of my first few flights around my house with DJI’s Mavic Pro – a fantastic, compact little flying camera platform.  This thing folds down to about the size of a quart-size water bottle, weighs only 2 lbs with a battery and flies for about 25 minutes per charge.  It’s got a tiny gimbal-stabilized 4K camera that can capture up to 4096×2160 video.  (The video above is downgraded to 1280×720 but here’s a short snippet of 4K footage.)  The Mavic Pro has lots of sophisticated smarts on board too: automatic return to home, obstacle avoidance, vision positioning system, object tracking/following/circling, etc.  It maintains a live high definition feed to your phone/controller with a range of over 4 miles, though FAA rules require that you maintain visual line-of-sight and stay below 400 ft above ground level at all times.  (Also, drones and other remote-controlled aircraft cannot be used in national parks, wilderness areas, ski resorts, around crowds or events, etc. without special permission.)

The Mavic Pro is very fun and easy to fly and it’s amazing how clear and stable the video footage is, even enough to use it as a flying tripod or do time-lapse photography.  Best of all it folds down so nicely to fit easily into a small backpack or carrying case.  It’ll be fun to bring this along on some hiking and biking trips.

Here’s some additional footage – the first from nearby Wilder Ranch State Park, including trying out the Mavic’s “Active Track” flight mode and the second from just north of Pescadero Beach while looking for whales:

Wilder Ranch (50 seconds, 59 MB)

Near Pescadero (85 seconds, 110 MB)

Here’s DJI’s related SkyPixel site where you can see sample drone photography.  Here’s one of many reviews of DJI’s Mavic Pro, if you’re interested in more detail.  I’d recommend buying DJI’s “Fly More Combo Pack” which includes the Mavic Pro plus two extra batteries, two extra propellers, the four-battery charging hub, a car charger cord, an adapter for charging your phone or other USB device from a battery pack and the DJI carrying case/shoulder bag.  You’ll also likely want to get a lens shade as the Mavic’s camera tends to easily catch sunlight even when not pointed at the sun.  This one works well, while this one is too fragile and breaks easily just from mounting it.

And one more bit of footage – sneaking up on Darlene’s family while they were here visiting:

Drone Attack! (60 seconds, 22 MB)


Maker Faire 2016

A bit of video from this year’s Bay Area Maker Faire:

Video montage of the 2016 Bay Area Maker Faire (5:35 minutes, 114 MB)


Riding the Segway

While visiting with Darlene’s family in Wisconsin/Minnesota, we went for a Segway ride and tour in La Crosse this past Sunday with Shel, Dan, Kathy and Shelly.  It was my first time trying one and it was a lot of fun.  The handling is very intuitive and responsive – to the point of being a little addictive!  If you have yet to try one, look for a tour or rental in your area (like La Crosse Segway Tours) – it’s definitely worth it!

Click through for the full gallery of pics and video:


A short, 75-second video montage from our Segway ride in La Crosse, WI.


“Have you played Atari today?”

A little while ago, after reading “Ready Player One” again (Spielberg is making a movie!) and after seeing a couple of tech talks by old Atari game programmers, I was lamenting that I sold my old Atari VCS so many years ago.  Well, Darlene jumped on this comment, found a bundle someone was selling on eBay and surprised me with an early birthday gift.  Yup, an old Atari VCS/2600 (four switch version), a set of controllers and a bundle of game cartridges. Sweet!  (I think my brother and I actually had the six-switch, Sears-rebranded version, but still very cool!) Thanks, Darlene!

I immediately had to go fill out the set of 40 cartridges with a couple of other games I remember us playing a lot.  Of course then came the challenge of hooking it up: the Atari outputs an analog RF TV signal… on an RCA-plug cable.  You can use an adapter like this one to go from the RCA plug to a coax TV cable input.  I don’t have a TV with a tuner, so rather than pulling a VCR out of a box in a closet, I hooked it up via my old USB EyeTV tuner/video converter to my MacBook – success!


Yeah, you can play any of these games via emulation on a modern computer, or even a smartphone/iPad, but there’s something very different about jamming the physical cartridge into the old physical console and handling that classic Atari joystick.  (And having to use cotton swabs and alcohol to clean the contacts on all of the Activision cartridges to get them to work again!)

It’s been fun to pick these up and rediscover old visual/procedural memories, like the admittedly-simple path through the Adventure maze.  Some titles are only vaguely familiar until you plug them in and see the game again and then go “aha!!”

So… to paraphrase Atari’s old marketing… have you played your Atari today?



Spruce Goose

While up in Portland, Oregon this past weekend for my brother’s birthday, Glenn, Michele and I made a day’s excursion to the Evergreen Aviation and Space Museum, the current home of the “Spruce Goose” and a huge variety of other aircraft.  All pretty cool and definitely a worthwhile visit, but it’s a little annoying that they charge extra (and separately) for tours inside two of the aircraft: a B-17 bomber (“Flying Fortress”) and the Spruce Goose itself.

Click through for pictures:



Robot in the House

For years I’ve dismissed those little semi-autonomous, robotic sucking machines.  It sounded like they weren’t really worth the trouble: they couldn’t run for very long, couldn’t hold much debris in their tiny compartments, and couldn’t deal with furniture without missing spots or getting stuck or trapped.  With all the supervision they needed, it sounded easier and quicker to just do it yourself.  But then recently I stumbled on a review of a new model and was intrigued by the improvements and the possibility of a little machine to help keep up with all that cat hair my two furry friends are always producing.

 A fun video showing my new little helper in action (1.5 minutes, 24 MB)

It’s the BotVac 80/85 from Neato.  Unlike its more well-known competitor (iRobot Roomba), this robotic vacuum cleaner does not just follow a random walk around the room, bumping haphazardly from one obstacle to the next.  The BotVac uses laser sensors to map out the shape of each room and build up a floor plan as it goes about its business.  When it encounters obstacles like tables and chairs, it will actually work to navigate around each leg, vacuuming under and around as much as it can.  It’s pretty amazing (and mesmerizing) to watch it navigate around the house, room after room, following its little internal rule sets to deal with various obstacles as they come up.

  • When the BotVac gets low on charge, it will actually backtrack through the map it built to return to its charging base and dock itself for recharging, even off in another room.  And when it has finished recharging it will return by itself to where it left off and continue the job!
  • It’s got touch sensors in front to help it maneuver tightly to objects and walls.
  • It has a sensor underneath to keep it from running off a cliff (or stairs).
  • It comes with some magnetic strips that you can lay down on the floor to cordon off rooms or areas that you don’t want it to intrude on.  (It’s much simpler than the battery-operated “fence” posts that the iRobot apparently uses.)
  • It has a little edge-cleaning brush on the right side.  (Thus it will always approach walls and make its rounds in a right-handed path.)
  • It’s squared off in front so that it can get into corners much better than fully round designs like the Roomba.
  • It has a larger-than-typical dust bin and it’s very easy to remove and empty out – without even having to turn over the unit.  It makes sense to also vacuum out the dust filter though.
  • You can set a schedule for when it should run but this doesn’t seem practical to me as I would first want to clear stuff off the floor and make sure there aren’t any cat messes that it would get into – and make worse.  (Hera often has stomach issues.)

It’s not quite a replacement for a full-size vacuum cleaner but it certainly does an amazing job considering that you can just start it up and let it go while you go about doing other things.  (You do also need a normal vacuum cleaner to clear out its filter.)  It’s pretty cool though to come back and find everything freshly vacuumed!  And it’s not really that loud (certainly much quieter than a full-size vacuum), so it’s not too annoying to have it going about its business nearby.

One limitation with the BotVac is that at about four inches tall, it can’t fit under some furniture, particularly couches.  (The Roomba design has a lower profile and can fit under more furniture.)  Also, the BotVac can get itself stuck at times and need help.  This happens sometimes with furniture that offers just enough clearance for it to partially slip under but not quite enough for it to fit entirely under.  Often this goes fine and it will just work its way around, but other times it’ll get itself wedged in and need to be pulled out.  When it does get stuck or trapped, it will cut power to its vacuum and call for help by chiming.  It’ll then sit and wait quietly for a while before chiming now and again.

Here’s a much more mixed review of the BotVac that comes out in favor of the Roomba.  Some more reviews: BotVac 85 vs. Roomba 880 (favors the BotVac) and iRobot Roomba vs Neato Botvac (favors neither).

Note that the BotVac 85 is really just the same model as the 80 but with two extra filters included.  (This wasn’t obvious to me.)  Both the 80 and 85 come with the two different brush types.

Now… what should I name him?

UPDATE (1/28/2015): The BotVac is still running but I have seen more of its deficiencies.  One thing that happens is that it essentially becomes a little senile on a low battery charge: it often becomes unable to find its way back to its charging station when its battery runs low.  It will repeatedly and aimlessly search a small area (a couple of square feet) and, after a long while, finally give up and call for help – this without any obstacles in the way.  My guess is that it now lets the battery voltage drop too far and can no longer sufficiently power its electronics and sensors.  At first it only happened occasionally, then it started happening almost every time.  But then, more recently (April 2015), it’s been working properly again! Weird. Anyway, when it does “go senile”, I have to pick it up and manually dock it at its charging station.  (If I let it continue its search with the dock right in front of it, it will just wander off again.)

The other issue (and this is more annoying) is that its methodical approach to covering a room means that it will try over and over again (unsuccessfully) to reach some particularly difficult spot (due to furniture) and waste a lot of its battery charge or even eventually get itself wedged in or otherwise stuck.  Bringing it back out again will often just lead to it finding its way right back into that spot.  I’ve since gotten into the habit of leaving some strategically placed pillows or other items to block it from those spots.  This is where I imagine the Roomba might do better with its random walk pattern: it probably won’t get stuck obsessively trying to reach the same spot.

Lastly, as I mentioned earlier, the biggest problem with the BotVac is that the little laser assembly sticks up in the center of the unit.  This protrusion isn’t accounted for when the unit tries to go under furniture, so it can end up wasting energy trying over and over to squeeze underneath – or even getting itself wedged under.

However, the BotVac does still do a good vacuuming job and it’s great to be able to set it off running while you take care of other things.


Pivothead Sunglasses Camera

I’ve been trying out a new pair of sunglasses with a built-in video/photo camera, the Pivothead Recon.  The Recon (actually now called the Kudu) is one of several styles of camera glasses from Pivothead.  The glasses can record video at 1080p/30 or 720p/60 fps as well as take still photos (up to 8 MP).  They can even capture stills while you’re recording video.  They have interchangeable shades, including the photochromic kind (adjusts to brightness).

The camera functionality works pretty well except that there are currently some issues with the various focus modes.  The continuous focus mode hunts for focus a lot, and the fixed focus mode is set to a focus point that’s too close, so most of the time everything is very softly focused.  I’m getting the best results with the auto-focus mode, which sets a focus when you start recording and holds it for the duration of the recording.

The exposure isn’t always ideal but then it’s pretty amazing that they can cram all this functionality in a sunglasses frame (rather than having a bulky camera mounted on your helmet).  Another issue though is that the little LED lights on the inside of the frame aren’t really visible while you’re wearing them so you have to pull them half off to verify that they’re on and/or recording.  Given that a) you can’t start recording until a couple of seconds after you hit the power button and b) they automatically shut off when idle after 30 seconds or so, it’s easy to think you’ve started recording, when you haven’t.  (This keeps happening to me.)  It would be better if the little rocker switch to start/stop a video or take a photo would actually power them up, rather than having a separate power toggle button.


Trying out the Pivothead Recon Sunglasses/Camera

The glasses aren’t very adjustable for different faces.  Mine tend to sit high on my nose and the camera points up a little high, but this ends up working out to a good angle for mountain biking.  Yes, that picture was taken using the glasses, but I tilted my head down (and still heavily cropped it to frame it well).  Without doing that, the center of the shot would actually be well above my head.  But as I say, that has to do with individual fit and it works out fine for me while on a bike.

The Pivothead charges via USB and you can get a combination external battery pack and WiFi hub (Pivothead Air Sync) that allows you to charge it up and download your shots when you’re out in the field.  It’s also useful as extra power for any USB-chargeable device.

Here’s some sample video showing some of its strengths and weaknesses.  (Also, this video was shot with these glasses as well.)  Both videos have of course been downsized and compressed for web presentation:


Note the challenges in dark, high contrast lighting in the trees and how quickly (or slowly) it can adjust to changing lighting.  Obviously it does much better in brightly lit scenes.  Also, it would be awesome to have some optical stabilization but that would be asking a lot at this point, particularly in the frame of a pair of sunglasses.  Really I just hope they can fix the other focus modes and improve the start/stop method.


Maker Faire 2013

Pictures and a bit of video from a day at the Bay Area Maker Faire:



Space Shuttle Flyover

I went down to the NASA Ames Research Center / Moffett Field this morning to watch the flyover of the space shuttle Endeavour on its way to a museum in southern California.


There were a number of booths set up showing some of the science and technology developed at Ames to support the shuttle program, as well as a number of guest speakers including a couple of shuttle astronauts.  Unfortunately, the host speaker built up expectations a bit much by describing how the shuttle and its 747 carrier were expected to come down the length of the runway potentially as low as 200 ft.  With a reported 20,000 of us gathered along the length of the runway, this would have been quite spectacular to see.  Alas, the pilots clearly had other plans.


After we were told the shuttle was approaching, it was the escort jet that became visible first, and we were all watching *it* as the shuttle itself made a stealthy approach hidden behind the large hangar on the opposite side of the runway.  It was nearly on top of us when it popped into view and everyone turned (and hastily swung their cameras around) to see it fly over — at a more mundane 1,500 ft or so.

Here’s a short video I created of the event:

Still, it was fun to get to see it with the big crowd and to hang out with others while waiting for its appearance!  I was there with the Geek Club Meetup group.


Here’s a great time-lapse video of the shuttle being maneuvered along the streets of Los Angeles.
