So You Want to Make a Space Video

OpenSpace and Movie Production

Dr. James Hedberg
City College of New York
Dept. of Physics
Director of CCNY Planetarium

Why?

Ancient Sci-Viz


Ancient Visuals


Babylonian Tablet (1900-1700 BC), Dunhuang Star Map (700 AD), Nicole Oresme (~1300s)

Copernicus's hand-drawn system of the world.

From the original manuscript of De Revolutionibus

Close-up: note the hole in the center where the compass needle has poked through the page.

Descartes

descartes

descartes-instrument

flowchart TD
  A[Curve drawing with sticks] --> B[Analytic geometry];
  B --> C[x,y,z];

Rule #14: The problem should be re-expressed in terms of the real extension of bodies and should be pictured in our imagination entirely by means of bare figures. Thus it will be perceived much more distinctly by our intellect.

Rule #15: It is usually helpful, also, to draw these diagrams and observe them through the external senses, so that by this means our thought can more easily remain attentive.

Descartes, Rules for the Direction of the Mind, c. 1628

Michael Faraday - Iron Filing diagram. 1851

Credit: Royal Institution

faraday-notebook

Visuals for Learning

sentence diagram

https://www.popchartlab.com/products/a-diagrammatical-dissertation-on-opening-lines-of-notable-novels

Why Only Us: Language and Evolution, by Robert C. Berwick and Noam Chomsky, 2015

The Periodic Table

periodic table

By Armtuk - Own work, CC BY-SA 3.0, Link

Lewis Structures

Physics


Biology

The Moving Image

Galileo and the Telescope

jovian moons

galileo moon sketches

jovian moons

Sidereus Nuncius Jupiter observations

Assembled from a source courtesy of The Linda Hall Library of Science, Engineering & Technology

Gingerich, O. & van Helden, A., "From 'Occhiale' to printed page: the making of Galileo's 'Sidereus nuncius'", Journal for the History of Astronomy (ISSN 0021-8286), Vol. 34, Part 3, No. 116, pp. 251-267 (2003)

Passage de Venus, Pierre Jules César Janssen, 1874

Ludwig Münch, Darmstadt, Animated Geometry ~ 1911

Origins of scientific cinematography, © IWF / CNRS / Luce / V. Tosi - 1992

The Legacy

Hand Painted frame from Le Voyage dans la Lune

Georges Méliès, ~1902

We have come a long way since those earliest efforts over a century ago.

Destination Moon, (1950)

Universe, NFB (1960)

2001 frame

2001: A Space Odyssey (1968)

Force, Mass and Motion

by F. Swinden, AT&T / Bell Labs, 1968

OpenSpace

Strengths

Realtime Visualization

Thanks to SGCT, OpenSpace supports a wide range of output formats.

Data driven

Chasm in Gale Crater on Mars

Phobos Eclipse on Mars - Simulation vs. Camera

Mercury - 3 years of celestial motion

Plotting Eagle Tracking Data

Getting Started with Video Output

  1. Simple Screen Capture
  2. Output to OBS
  3. Session Recording
  4. API control

Method One: Session Recording

Use the normal navigation controls to control the camera. Record your motions, then replay with Output Frames.

This method is great for general, cinematic style views.

Method One: Session Recording

The main limitations are your skill as a pilot and the difficulty of changing properties mid-flight; we only have so many fingers and hands.

Method One: Session Recording

There are also small annoyances, e.g. tile loading times: even when referencing local data, there can be noticeable jumps in the terrain as higher-resolution tiles load in.

Tips

1. Customize your window size for the media you want to make. Use these in openspace.cfg:

-- VERTICAL VIDEO
SGCTConfig = sgct.config.single{675, 1200, res={1080, 1920}, vsync=false}

-- SQUARE
SGCTConfig = sgct.config.single{1080, 1080, res={1080, 1080}, vsync=false}

(Or make custom sgct configurations)

Tips

2. Use defined folders to save your images.

openspace.setScreenshotFolder(openspace.absPath("${USER}/screenshots/description"));

Record the Session

Render Frames

The images all in a folder

Now, all the images are in one folder, and are numbered sequentially.

Import the sequence into Premiere Pro (or whichever video editor you prefer).

Select an Image Sequence on import

Now the images are one sequence

Now you can edit your video.
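If you'd rather skip a GUI editor entirely, ffmpeg can stitch the numbered frames into a video directly. A sketch that assembles the command (the frame-name pattern and frame rate below are assumptions; match them to your screenshot files):

```python
import shlex

def ffmpeg_cmd(pattern, fps=30, out="out.mp4"):
    """Assemble an ffmpeg invocation that encodes a numbered image
    sequence into a widely compatible H.264 MP4."""
    return ["ffmpeg", "-framerate", str(fps), "-i", pattern,
            "-c:v", "libx264", "-pix_fmt", "yuv420p", out]

# Frame-name pattern is an example; match it to your screenshots folder.
print(shlex.join(ffmpeg_cmd("OpenSpace_%06d.png")))
```

Run the printed command in the folder containing the frames (or hand the list to `subprocess.run`).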

Fancy Pants

Layers

Now that the session is recorded, you can turn things on/off during various render passes and make layers to play with.

  1. Label Layer
  2. Trails Layer
  3. Date Time Layer
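These per-pass toggles can also be scripted over the same WebSocket channel used elsewhere in this talk, rather than clicked in the GUI. A sketch; the scene-graph property path is illustrative (look up the real path in the OpenSpace GUI):

```python
import json

def toggle_message(prop, enabled):
    """Lua call to flip a renderable on or off between render passes."""
    return json.dumps({"topic": 4,
                       "type": "luascript",
                       "payload": {"function": "openspace.setPropertyValueSingle",
                                   "arguments": [prop, enabled],
                                   "return": False}})

# Example: hide a trail before re-rendering a labels-only pass.
# (Property path is illustrative -- check yours in the GUI.)
msg = toggle_message("Scene.EarthTrail.Renderable.Enabled", False)
```

Send `msg` over the open connection (`ws.send(msg)`) before replaying the recorded session for that pass.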

Layers

Layers

Layers

Customize Date Dashboard

Render just the date, and now you have a layer.

Method Two: Programmatic Control

"Every day at noon, the sun..."

"If you look to the southern skies about 40 minutes after sunset this week, you'll see..."

"Planets in retrograde motion move backwards with respect to the stars..."

Seconds past J2000 → DateTime

snapshotdate array:
789022870.0   01 January 2025 17:00
789109270.0   02 January 2025 17:00
789195670.0   03 January 2025 17:00
789282070.0   04 January 2025 17:00
789368470.0   05 January 2025 17:00
789454870.0   06 January 2025 17:00
789541270.0   07 January 2025 17:00
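Those snapshotdate values are seconds past the J2000 epoch (2000-01-01 12:00). A rough UTC-based conversion can be sketched as follows; it ignores the roughly 69 seconds of leap-second and TT offsets that separate UTC from true ephemeris time, so use SPICE or similar when exactness matters:

```python
from datetime import datetime, timezone

J2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)

def seconds_past_j2000(dt):
    """Approximate seconds past J2000, ignoring the ~69 s
    leap-second/TT offset of true ephemeris time."""
    return (dt - J2000).total_seconds()

# One entry per day at 17:00 UTC, as in the table above
snapshotdate = [seconds_past_j2000(datetime(2025, 1, day, 17, tzinfo=timezone.utc))
                for day in range(1, 8)]
print(snapshotdate[0])   # 789022800.0 (~70 s from the table's ephemeris value)
```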

Code excerpt

import json
import time
# `ws` is an open WebSocket connection to a running OpenSpace instance

for i, value in enumerate(snapshotdate):
    timestamp = str(value)

    # Set the simulation time to the next snapshot date...
    message = json.dumps({"topic": 4,
                          "type": "luascript",
                          "payload": {"function": "openspace.time.setTime",
                                      "arguments": [timestamp],
                                      "return": False}})
    ws.send(message)

    print(".", end="", flush=True)
    time.sleep(0.1)  # give the renderer a moment to catch up

    # ...then take a screenshot
    message = json.dumps({"topic": 4,
                          "type": "luascript",
                          "payload": {"function": "openspace.takeScreenshot",
                                      "arguments": [],
                                      "return": False}})
    ws.send(message)

    time.sleep(0.1)

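The excerpt above assumes an already-open connection `ws`. A minimal setup sketch using the websocket-client package; the address shown is a common default for OpenSpace's server module, but that is an assumption, so check your configuration:

```python
import json

# Opening the connection (requires `pip install websocket-client` and a
# running OpenSpace; the address is an assumption -- check your config):
#   import websocket
#   ws = websocket.create_connection("ws://localhost:4682/websocket")

def lua_message(function, arguments=()):
    """Build the JSON envelope used for the luascript calls in this talk."""
    return json.dumps({"topic": 4,
                       "type": "luascript",
                       "payload": {"function": function,
                                   "arguments": list(arguments),
                                   "return": False}})

# e.g. ws.send(lua_message("openspace.takeScreenshot"))
```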
Where is the Sun at Noon?

By using a script to control the datetime, you can record the same sequence from different camera positions.

Get Sunset times

import ephem
import datetime

# Observer at the Grand Canyon
gc = ephem.Observer()
gc.lat, gc.lon = '35.9739', '-113.7689'
gc.pressure = 0  # ignore atmospheric refraction
d = ephem.Date('2025/1/9')
numberOfDays = 14
sunrises = []
sunsets = []

for i in range(numberOfDays):
    gc.date = d + i
    # Sample shortly after each rising/setting
    tograbSunrise = ephem.to_timezone(gc.next_rising(ephem.Sun()), ephem.UTC) + datetime.timedelta(minutes=10)
    sunrises.append(tograbSunrise)
    tograbSunset = ephem.to_timezone(gc.next_setting(ephem.Sun()), ephem.UTC) + datetime.timedelta(minutes=1)
    sunsets.append(tograbSunset)

Prepare String for OpenSpace

alltimes = []

for i in range(numberOfDays):
    tograb = sunsets[i]
    timeandday = tograb.isoformat().replace('+00:00', '')
    alltimes.append(timeandday)

Take the pictures

number_of_photos = len(alltimes)

for i in range(number_of_photos):
    timestamp = alltimes[i]
    message = json.dumps({"topic": 4,
                          "type": "luascript",
                          "payload": {"function": "openspace.time.setTime",
                                      "arguments": [timestamp],
                                      "return": False}})
    ws.send(message)
    time.sleep(0.5)
    message = json.dumps({"topic": 4,
                          "type": "luascript",
                          "payload": {"function": "openspace.takeScreenshot",
                                      "arguments": [],
                                      "return": False}})
    ws.send(message)
    time.sleep(0.5)

Simple GIF for 2 weeks of skywatching

Here's the result: the positions of Venus and Saturn at sunset, every night for two weeks, over the Grand Canyon.

Now, with layers, you can be the director of cinematography.

And, there are many other knobs to turn. (This one ramps the FOV from 90°-25° during the 2 weeks.)

Field of View

Adjust FOV during a sequence.

fovSetting = []
for i in range(len(alltimes)):
    fovSetting.append(90 - 70 * i / len(alltimes))

number_of_photos = len(alltimes)

for i in range(number_of_photos):
    timestamp = alltimes[i]
    message = json.dumps({"topic": 4,
                          "type": "luascript",
                          "payload": {"function": "openspace.time.setTime",
                                      "arguments": [timestamp],
                                      "return": False}})
    ws.send(message)

    message = json.dumps({"topic": 0,
                          "type": "set",
                          "payload": {"property": "RenderEngine.HorizFieldOfView",
                                      "value": fovSetting[i],
                                      "return": False}})
    ws.send(message)

CONTEXT! - What if Manhattan was on the Moon?

Non-Linear Time

To highlight a particular event during a longer period of time, we can customize the Δt values in the array.


import math

totalimages = 600
intervals = []

# Gaussian dip: the time step shrinks near the event (frame b), then recovers
a = 3550              # depth of the dip, in seconds
b = totalimages / 2   # frame at the center of the event
c = 110               # width of the dip, in frames

for i in range(totalimages):
    intervalsCalc = -a * math.exp(-((i - b)**2) / (2 * c**2)) + 3600
    if intervalsCalc < 120:
        intervalsCalc = 120   # clamp: never step less than two minutes
    intervals.append(round(intervalsCalc, 4))
    
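Accumulating those per-frame Δt values yields the timestamp array to feed back into openspace.time.setTime. A sketch with a stub interval list and a hypothetical start time:

```python
from datetime import datetime, timedelta, timezone

# Stub interval list; in practice use the Gaussian-ramped `intervals`
# computed above. The start time is a hypothetical example.
intervals = [3600, 1800, 120]
t = datetime(2025, 1, 1, 0, 0, tzinfo=timezone.utc)

alltimes = []
for dt in intervals:
    t += timedelta(seconds=dt)
    alltimes.append(t.isoformat().replace('+00:00', ''))

print(alltimes[0])   # 2025-01-01T01:00:00
```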

Astronomical Time Scales

More scripting fun

Anything that you can click or change in the GUI, you can also control programmatically.

Textures

Here, we are changing the constellation art a few times per second with a script.

Parameters

This one links the star color multiplier to an audio track.

Full Auto

A script updated the Earth with a new texture and ran through a day of rotation; another script compiled the frames into the finished movie.

Experimental

Here we have the alt/az of Mercury in a table. The camera is then rotated to always point at that position in the sky.


message = json.dumps({"topic": 0,
                      "type": "set",
                      "payload": {"property": "Scene.CameraParent.Rotation.Rotation",
                                  "value": [0, altitude, azimuth],
                                  "return": False}})
ws.send(message)

Here, we are tracking the position of Mars, advancing time, and adjusting the FOV.

Other formats/media

Full Dome


SGCTConfig = sgct.config.fisheye{1024, 1024}
  

VR


"projection": {
  "type": "EquirectangularProjection",
  "quality": "1k"
}
  

(Just make sure you have some extra drives.)

Ptolemaic Star Positions

Post Processing

Depth of Field

Jezero Crater - Depth of Field

Real cameras also have depth of field. We can simulate this by rendering a single pass of the z-depth layer.

Using the z-pass (a grayscale image encoding the distance from the camera to the surface), we can simulate a camera lens blur. It's not too useful for most space scenes, but it helps with close-range subjects.
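The heart of the effect is a mapping from z-depth to blur amount. A toy model, entirely illustrative (not an OpenSpace API; the constants are made up):

```python
def blur_radius(depth, focus_depth, max_radius=8.0, falloff=0.002):
    """Toy depth-of-field model: blur is zero at the focal plane and
    grows linearly with distance from it, up to a cap.
    All constants are illustrative, not from OpenSpace."""
    return min(max_radius, abs(depth - focus_depth) * falloff)

# A pixel 500 m behind the 1000 m focal plane gets a 1 px blur radius:
print(blur_radius(1500.0, 1000.0))   # 1.0
```

A compositing tool then applies a per-pixel blur of that radius, driven by the z-pass image.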

General Tips

  • Customize things!

    • Fonts: Mono = "${FONTS}/Bitstream-Vera-Sans-Mono/VeraMono.ttf"
    • Dates: ::UTC-h:m (from timout_c)
    • Labels / Localizations

Adjust the LOD (Level of Detail setting)

Changing the Level of Detail

Custom Labels

Parent a billboard to the object you want. Adjust size.

Starlink Launches over Time

Full Multimedia

Voyager Velocity Sonification

In Conclusion

Thank you!

So you want to write a fugue?

You've got the urge to write a fugue

You've got the nerve to write a fugue

So go ahead and write a fugue that we can sing

Pay no heed to what we've told you

Give no mind to what we've told you

Just forget all that we've told you

And the theory that you've read

For the only way to write one

Is just to plunge right in and write one

So just forget the rules and write one

Have a try, yes, try to write a fugue

So just ignore the rules and try

Glenn Gould, So You Want to Write a Fugue?