What exactly is a sun eclipse? Will I be able to see it and if so when from the Netherlands?
A solar eclipse happens when the Moon passes directly in front of the Sun and casts a shadow on the Earth. Solar eclipses happen about once every 18 months. I don’t believe that you’ll be able to see this eclipse from the Netherlands. I think the next one visible from Europe is in 2026. There’s one in Chile and Argentina in 2019 and another in Antarctica in 2021.
If you look at your baby photos, you might see hints of the person you are today — a certain look in the eyes, maybe the hint of your future nose or ears. In the same way, scientists examine the universe’s “baby picture” for clues about how it grew into the cosmos we know now. This baby photo is the cosmic microwave background (CMB), a faint glow that permeates the universe in all directions.
In late September, NASA plans to launch a balloon-based astronomical observatory from Fort Sumner, New Mexico, to study the universe’s baby picture. Meet PIPER! The Primordial Inflation Polarization Explorer will fly at the edge of our atmosphere to look for subtle patterns in the CMB.
The CMB is cold. Really, really cold. The average temperature is around minus 455 degrees Fahrenheit. It formed 380,000 years after the big bang, which scientists think happened about 13.8 billion years ago. When it was first discovered, the CMB temperature looked very uniform, but researchers later found there are slight variations like hot and cold spots. The CMB is the oldest light in the universe that we can see. Anything before the CMB is foggy — literally.
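That "minus 455 degrees Fahrenheit" is just the CMB's average temperature in kelvins converted to Fahrenheit. A quick sanity check (a sketch using the commonly cited 2.725 K average, a standard value that isn't stated in the post):

```python
# Convert the CMB's average temperature from kelvins to Fahrenheit.
# 2.725 K is the standard textbook value for the CMB, used here for illustration.

def kelvin_to_fahrenheit(kelvin: float) -> float:
    """Convert a temperature in kelvins to degrees Fahrenheit."""
    return kelvin * 9.0 / 5.0 - 459.67

cmb_kelvin = 2.725
print(f"{kelvin_to_fahrenheit(cmb_kelvin):.1f} F")  # about -454.8 F, i.e. roughly minus 455
```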
Credit: Rob van Hal
Before the CMB, the universe was a fog of hot, dense plasma. (By hot, we’re talking about 500 million degrees F.) That’s so hot that atoms couldn’t exist yet – there was just a soup of electrons and protons. Electrons are great at deflecting light. So, any light that existed in the first few hundred thousand years after the big bang couldn’t travel very far before bouncing off electrons, similar to the way a car’s headlights get diffused in fog.
After the big bang, the universe started expanding rapidly in all directions. This expansion is still happening today. As the universe continued to expand, it cooled. By the time the universe reached its 380,000th birthday, it had cooled enough that electrons and protons could combine into hydrogen atoms for the first time. (Scientists call this era recombination.) Hydrogen atoms don’t deflect light nearly as well as loose electrons and the fog lifted. Light could now travel long distances across the universe.
The light we see in the CMB comes from the recombination era. As it traveled across the universe, through the formation of stars and galaxies, it lost energy. Now we observe it in the microwave part of the electromagnetic spectrum, which is less energetic than visible light and therefore invisible to our eyes. The first baby photo of the CMB – really, a map of the sky in microwaves – came from our Cosmic Background Explorer, which operated from 1989 to 1993.
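The "energy loss" described above is the stretching of the light's wavelength by cosmic expansion. A toy calculation shows how light from the recombination era ends up in the microwave band (the CMB redshift of roughly z = 1100 and the emitted wavelength here are standard illustrative values, not figures from this post):

```python
# Sketch of how cosmic expansion stretches light's wavelength.
# Observed wavelength = emitted wavelength * (1 + z), where z is the redshift.

def observed_wavelength_nm(emitted_nm: float, z: float) -> float:
    """Wavelength we see today, given the emitted wavelength and redshift z."""
    return emitted_nm * (1.0 + z)

emitted = 1000.0          # nm, near-infrared light from the hot recombination-era gas
z_cmb = 1100.0            # approximate redshift of the CMB (textbook value)
stretched = observed_wavelength_nm(emitted, z_cmb)
print(f"{stretched / 1e6:.1f} mm")  # about 1.1 mm: firmly in the microwave band
```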
Why are we so interested in the universe’s baby picture? Well, it’s helped us learn a lot about the structure of the universe around us today. For example, the Wilkinson Microwave Anisotropy Probe produced a detailed map of the CMB and helped us learn that the universe is 68 percent dark energy, 27 percent dark matter and just 5 percent normal matter — the stuff that you and stars are made of.
Right after the big bang, we’re pretty sure the universe was tiny. Really tiny. Everything we see today would have been stuffed into something smaller than a proton. If the universe started out that small, then it would have followed the rules of quantum mechanics. Quantum mechanics allows all sorts of strange things to happen. Matter and energy can be “borrowed” from the future then crash back into nothingness. And then cosmic inflation happened and the universe suddenly expanded by a trillion trillion times.
All this chaos creates a sea of gravitational waves. (These are called “primordial” gravitational waves and come from a different source than the gravitational waves you may have heard about from merging neutron stars and black holes.) The signal of the primordial gravitational waves is a bit like white noise, while the signal from merging dead stars is like a whistle you can pick up over the noise.
These gravitational waves filled the baby universe and created distinct patterns, called B-mode polarization, in the CMB light. These patterns have handedness, which means even though they’re mirror images of each other, they’re not symmetrical — like trying to wear a left-hand glove on your right hand. They’re distinct from another kind of polarization called E-mode, which is symmetrical and echoes the distribution of matter in the universe.
That’s where PIPER comes in. PIPER’s two telescopes sit in a hot-tub-sized container of liquid helium, which runs about minus 452 degrees F. It’ll look at 85 percent of the sky and is extremely sensitive, so it will help us learn even more about the early days of the universe. By telling us more about polarization and those primordial gravitational waves, PIPER will help us understand how the early universe grew from that first baby picture.
PIPER’s first launch window in Fort Sumner, New Mexico, is in late September. When it’s getting ready to launch, you’ll be able to watch the balloon being filled on the Columbia Scientific Balloon Facility website. Follow NASA Blueshift on Twitter or Facebook for updates about PIPER and when the livestream will be available.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com.
On May 19, 2022, our partners at Boeing launched their CST-100 Starliner spacecraft to the International Space Station as a part of our Commercial Crew Program. This latest test puts the company one step closer to joining the SpaceX Crew Dragon in ferrying astronauts to and from the orbiting laboratory. We livestreamed the launch and docking at the International Space Station, but how? Let’s look at the communications and navigation infrastructure that makes these missions possible.
Primary voice and data communications are handled by our constellation of Tracking and Data Relay Satellites (TDRS), part of our Near Space Network. These spacecraft relay communications between the crewed vehicles and mission controllers across the country via terrestrial connections with TDRS ground stations in Las Cruces, New Mexico, and Guam, a U.S. territory in the Pacific Ocean.
TDRS, as the primary communications provider for the space station, is central to the services provided to Commercial Crew vehicles. All spacecraft visiting the orbiting laboratory need TDRS services to successfully complete their missions.
During launches, human spaceflight mission managers ensure that Commercial Crew missions receive all the TDRS services they need from the Near Space Operations Control Center at our Goddard Space Flight Center in Greenbelt, Maryland. There, communications engineers synthesize network components into comprehensive and seamless services for spacecraft as they launch, dock, undock, and deorbit from the space station.
Nearby, at our Flight Dynamics Facility, navigation engineers track the spacecraft on their ascent, leveraging years of experience supporting the navigation needs of crewed missions. Using tracking data sent to our Johnson Space Center in Houston and relayed to Goddard, these engineers ensure astronaut safety throughout the vehicles’ journey to the space station.
Additionally, our Search and Rescue office monitors emergency beacons on Commercial Crew vehicles from their lab at Goddard. In the unlikely event of a launch abort, the international satellite-aided search and rescue network will be able to track and locate these beacons, helping rescue professionals to return the astronauts safely. For this specific uncrewed mission, the search and rescue system onboard the Boeing Starliner will not be activated until after landing for ground testing.
To learn more about NASA’s Space Communications and Navigation (SCaN) services and technologies, visit https://www.nasa.gov/directorates/heo/scan/index.html. To learn more about NASA’s Near Space Network, visit https://esc.gsfc.nasa.gov/projects/NSN.
Make sure to follow us on Tumblr for your regular dose of space!
Our Space Launch System (SLS) rocket is coming together at the agency’s Kennedy Space Center in Florida this summer. Our mighty SLS rocket is set to power the Artemis I mission to send our Orion spacecraft around the Moon. But, before it heads to the Moon, NASA puts it together right here on Earth.
Read on for more on how our Moon rocket for Artemis I will come together this summer:
How do crews assemble a rocket and spacecraft as tall as a skyscraper? The process all starts inside the iconic Vehicle Assembly Building at Kennedy with the mobile launcher. Recognized as a Florida Space Coast landmark, the Vehicle Assembly Building, or VAB, houses special cranes, lifts, and equipment to move and connect the spaceflight hardware together. Orion and all five of the major parts of the Artemis I rocket are already at Kennedy in preparation for launch. Inside the VAB, teams carefully stack and connect the elements to the mobile launcher, which serves as a platform for assembly and, later, for fueling and launching the rocket.
Because they carry the entire weight of the rocket and spacecraft, the twin solid rocket boosters for our SLS rocket are the first elements to be stacked on the mobile launcher inside the VAB. Crews with NASA’s Exploration Ground Systems and contractor Jacobs team completed stacking the boosters in March. Each taller than the Statue of Liberty and adorned with the iconic NASA “worm” logo, the five-segment boosters flank either side of the rocket’s core stage and upper stage. At launch, each booster produces more than 3.6 million pounds of thrust and burns for just two minutes to quickly lift the rocket and spacecraft off the pad and to space.
In between the twin solid rocket boosters is the core stage. The stage has two huge liquid propellant tanks, computers that control the rocket’s flight, and four RS-25 engines. Weighing more than 188,000 pounds without fuel and standing 212 feet tall, the core stage is the largest element of the SLS rocket. To place the core stage in between the two boosters, teams will use a heavy-lift crane to raise and lower the stage into place on the mobile launcher.
On launch day, the core stage’s RS-25 engines produce more than 2 million pounds of thrust and ignite just before the boosters. Together, the boosters and engines produce 8.8 million pounds of thrust to send the SLS and Orion into orbit.
Once the boosters and core stage are secured, teams add the launch vehicle stage adapter, or LVSA, to the stack. The LVSA is a cone-shaped element that connects the rocket’s core stage and Interim Cryogenic Propulsion Stage (ICPS), or upper stage. The roughly 30-foot LVSA houses and protects the RL10 engine that powers the ICPS. Once teams bolt the LVSA into place on top of the rocket, the diameter of SLS will officially change from a wide base to a narrower point, much like a change in the shape of a pencil from eraser to point.
Next in the stacking line-up is the Interim Cryogenic Propulsion Stage or ICPS. Like the LVSA, crews will lift and bolt the ICPS into place. To help power our deep space missions and goals, our SLS rocket delivers propulsion in phases. At liftoff, the core stage and solid rocket boosters will propel Artemis I off the launch pad. Once in orbit, the ICPS and its single RL10 engine will provide nearly 25,000 pounds of thrust to send our Orion spacecraft on a precise trajectory to the Moon.
When the Orion stage adapter crowns the top of the ICPS, you’ll know we’re nearly done stacking the SLS rocket for Artemis I. The Orion stage adapter is more than just a connection point. At five feet in height, it is the smallest SLS component stacked for Artemis I, but it holds and carries several small satellites called CubeSats. After Orion separates from the SLS rocket and heads to the Moon, these shoebox-sized payloads are released into space for their own missions to conduct science and technology research vital to deep space exploration.
Finally, our Orion spacecraft will be placed on top of our Moon rocket inside the VAB. The final piece will be easy to spot as teams recently added the bright red NASA “worm” logotype to the outside of the spacecraft. The Orion spacecraft is much more than just a capsule built to carry crew. It has a launch abort system, which will carry the crew to safety in case of an emergency, and a service module developed by the European Space Agency that will power and propel the spacecraft during its three-week mission. The uncrewed Artemis I mission will check out Orion’s critical systems, including navigation, communications systems, and the heat shield needed to support the astronauts who will fly on Artemis II and beyond.
The path to the pad requires many steps and check lists. Before Artemis I rolls to the launch pad, teams will finalize outfitting and other important assembly work inside the VAB. Once assembled, the integrated SLS rocket and Orion will undergo several final tests and checkouts in the VAB and on the launch pad before it’s readied for launch.
The Artemis I mission is the first in a series of increasingly complex missions that will pave the way for landing the first woman and the first person of color on the Moon. The Space Launch System is the only rocket that can send NASA astronauts aboard NASA’s Orion spacecraft and supplies to the Moon in a single mission.
Make sure to follow us on Tumblr for your regular dose of space!
Space telescopes like Hubble and our upcoming James Webb Space Telescope not only use light to create images; they can also break light down into individual colors (or wavelengths). Studying light this way can give us a lot of detail about the object that emitted it. For example, studying the components of the light from an exoplanet can tell us about its atmosphere’s color, chemical makeup, and temperature. How does this work?
Remember the primary colors you learned about in elementary school?
Those colors are known as the pigment or subtractive colors. Every other color is some combination of the primary colors: red, yellow, and blue.
Light also has its own primary colors, and they work in a similar way. These colors are known as additive or light colors.
TVs make use of light’s colors to create the pictures we see. Each pixel of a TV screen contains some amount of red, green and blue light. The amount of each light determines the overall color of the pixel. So, each color on the TV comes from a combination of the primary colors of light: red, green and blue.
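The pixel mixing described above can be sketched in a few lines. The 0-255 channel range and hex color codes are just the usual screen convention, not anything specific to this post:

```python
# Toy additive color mixing, as on a TV pixel: each of the red, green,
# and blue channels gets an intensity from 0 (off) to 255 (full strength).

def mix(red: int, green: int, blue: int) -> str:
    """Return the hex color code for a pixel with the given channel intensities."""
    return f"#{red:02x}{green:02x}{blue:02x}"

print(mix(255, 0, 0))      # pure red light
print(mix(255, 255, 0))    # red + green light together appear yellow
print(mix(255, 255, 255))  # all three at full strength appear white
```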
Space telescope images of celestial objects are also a combination of the colors of light.
Every pixel that is collected can be broken down into its base colors. To learn even more, astronomers break the red, green and blue light down into even smaller sections called wavelengths.
This breakdown is called a spectrum.
With the right technology, every pixel of light can also be measured as a spectrum.
Images show us the big picture, while a spectrum reveals finer details. Astronomers use spectra to learn things like what molecules are in planet atmospheres and distant galaxies.
An Integral Field Unit, or IFU, is a special tool on the James Webb Space Telescope that captures images and spectra at the same time.
The IFU creates a unique spectrum for each pixel of the image the telescope is capturing, providing scientists with an enormous amount of valuable, detailed data. So, with an IFU we can get an image, many spectra and a better understanding of our universe.
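One way to picture an IFU's output is as a data cube: an ordinary 2D image where every pixel also carries a third, wavelength axis. A minimal sketch of that structure (the array sizes here are made up purely for illustration):

```python
# An IFU-style data cube pairs every image pixel with its own spectrum.
# Conceptually it is a 3D array indexed by (row, column, wavelength bin).
# Dimensions below are invented for illustration only.

N_ROWS, N_COLS, N_WAVELENGTHS = 4, 4, 100

# cube[row][col] is that pixel's spectrum: flux at each wavelength bin.
cube = [[[0.0] * N_WAVELENGTHS for _ in range(N_COLS)] for _ in range(N_ROWS)]

# Summing over the wavelength axis collapses the cube back into a plain image...
image = [[sum(cube[r][c]) for c in range(N_COLS)] for r in range(N_ROWS)]

# ...while slicing out a single pixel gives that pixel's full spectrum.
spectrum = cube[2][3]
print(len(spectrum))  # 100 wavelength bins for that one pixel
```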
Watch the full video where this method of learning about planetary atmospheres is explained:
The James Webb Space Telescope is our upcoming infrared space observatory, which will launch in 2021. It will spy the first galaxies that formed in the universe and shed light on how galaxies evolve, how stars and planetary systems are born and tell us about potentially habitable planets around other stars.
To learn more about NASA’s James Webb Space Telescope, visit the website, or follow the mission on Facebook, Twitter and Instagram.
Text and graphics credit: Space Telescope Science Institute
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com.
The Nancy Grace Roman Space Telescope is NASA’s next flagship astrophysics mission, set to launch by May 2027. We’re currently integrating parts of the spacecraft in the NASA Goddard Space Flight Center clean room.
Once Roman launches, it will allow astronomers to observe the universe like never before. In celebration of Black History Month, let’s get to know some Black scientists and engineers, past and present, whose contributions will allow Roman to make history.
The late Dr. Beth Brown worked at NASA Goddard as an astrophysicist. In 1998, Dr. Brown became the first Black American woman to earn a Ph.D. in astronomy at the University of Michigan. While at Goddard, Dr. Brown used data from two NASA X-ray missions – ROSAT (the ROentgen SATellite) and the Chandra X-ray Observatory – to study elliptical galaxies that she believed contained supermassive black holes.
With Roman’s wide field of view and fast survey speeds, astronomers will be able to expand the search for black holes that wander the galaxy without anything nearby to clue us into their presence.
In 1961, Dr. Harvey Washington Banks was the first Black American to graduate with a doctorate in astronomy. His research was on spectroscopy, the study of how light and matter interact, and it helped advance our knowledge of the field. Roman will use spectroscopy to explore how dark energy is speeding up the universe's expansion.
Aerospace engineer Sheri Thorn is ensuring Roman’s primary mirror will be protected from the Sun so we can capture the best images of deep space. Thorn works on the Deployable Aperture Cover, a large, soft shade known as a space blanket. It will be mounted to the top of the telescope in the stowed position and then deployed after launch. Thorn helped in the design phase and is now working on building the flight hardware before it goes to environmental testing and is integrated to the spacecraft.
Roman will be orbiting a million miles away at the second Lagrange point, or L2. Staying updated on the telescope's status and health will be an integral part of keeping the mission running. Electronics engineer Sanetra Bailey is the person who is making sure that will happen. Bailey works on circuits that will act like the brains of the spacecraft, telling it how and where to move and relaying information about its status back down to Earth.
Learn more about Sanetra Bailey and her journey to NASA.
Roman’s field of view will be at least 100 times larger than the Hubble Space Telescope's, even though the primary mirrors are the same size. What gives Roman the larger field of view are its 18 detectors. Dr. Gregory Mosby is one of the detector scientists on the Roman mission who helped select the flight detectors that will be our “eyes” to the universe.
Dr. Beth Brown, Dr. Harvey Washington Banks, Sheri Thorn, Sanetra Bailey, and Dr. Greg Mosby are just some of the many Black scientists and engineers in astrophysics who have paved, and continue to pave, the way for others in the field. The Roman Space Telescope team will continue to highlight those who came before us and those who are here now, so we can truly appreciate the amazing science to come.
To stay up to date on the mission, check out our website and follow Roman on X and Facebook.
Make sure to follow us on Tumblr for your regular dose of space!
Since the age of five, Jessica Meir dreamed of the day she would make it to space. That dream became a reality on Wednesday, Sept. 25, 2019, as she left Earth on her first spaceflight, later floating into her new home aboard the International Space Station. Jessica lifted off from Kazakhstan in the Soyuz MS-15 spacecraft at 9:57 a.m. EDT (1357 GMT) alongside spaceflight participant Ali Almansoori, the first United Arab Emirates astronaut, and Oleg Skripochka, a Russian cosmonaut.
As an Expedition 61 and 62 crew member, Jessica will spend six months in the vacuum of space – conducting research on a multitude of science investigations and participating in several Human Research Program studies.
While Jessica’s new home is more than 200 miles over the Earth, she is no stranger to extreme environments. She studied penguins in Antarctica and mapped caves in Italy – both of which prepared her for the ultimate extreme environment: space.
Get to know astronaut and scientist, Jessica Meir.
For her Ph.D. research, Jessica studied the diving physiology of marine mammals and birds. Her field research took her all the way to Antarctica, where she focused on oxygen depletion in diving emperor penguins. Jessica is also an Antarctic diver!
Image Credit: UBC Media Relations
Jessica investigated the high‐flying bar-headed goose during her post‐doctoral research at the University of British Columbia. She trained geese to fly in a wind tunnel while obtaining various physiological measurements in reduced oxygen conditions.
In 2013, Jessica was selected as an Astronaut Candidate. While training to be a full-fledged astronaut, she participated in three days of wilderness survival training near Rangeley, Maine, which was the first phase of her intensive astronaut training program.
In our astronaut office, Jessica gained extensive mission control experience, including serving as the Lead Capsule Communicator (CapCom) for Expedition 47, the BEAM (Bigelow Expandable Activity Module on the International Space Station) mission and an HTV (Japanese Space Agency cargo vehicle) mission. The CapCom is the flight controller who speaks directly to the astronaut crew in space, on behalf of the rest of the Mission Control team.
Following a successful launch to the space station, NASA astronaut Christina Koch tweeted this image of Jessica and the crew on their journey to the orbital lab in a Soyuz spacecraft. Excitement was high as Christina tweeted, “What it looks like from @Space_Station when your best friend achieves her lifelong dream to go to space. Caught the second stage in progress! We can’t wait to welcome you onboard, crew of Soyuz 61!”
Follow Jessica on Twitter at @Astro_Jessica and follow the International Space Station on Twitter, Instagram and Facebook to keep up with all the cool stuff happening on our orbital laboratory.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com
What design steps do you take to make sure that the robot runs smoothly, without anything like sand getting in the gears and wires?
On July 23, 1999, NASA’s Chandra X-ray Observatory, the most powerful X-ray telescope ever built, was launched into space. Since then, Chandra has made numerous amazing discoveries, giving us a view of the universe that is largely hidden from view through telescopes that observe in other types of light.
The technology behind X-ray astronomy has evolved at a rapid pace, producing and contributing to many spinoff applications you encounter in day-to-day life. It has helped make advancements in such wide-ranging fields as security monitoring, medicine and biomedical research, materials processing, semiconductor and microchip manufacturing, and environmental monitoring.
Two major developments influenced by X-ray astronomy include the use of sensitive detectors to provide low-dose but high-resolution images, and the linkage with digitizing and image processing systems. Because many diagnostic procedures, such as mammograms and osteoporosis scans, require multiple exposures, it is important that each dose be as low as possible. Accurate diagnoses also depend on the ability to view the patient from many different angles. Image processing systems linked to detectors capable of recording single X-ray photons, like those developed for X-ray astronomy purposes, provide doctors with the required data manipulation and enhancement capabilities. Smaller hand-held imaging systems can be used in clinics and under field conditions to diagnose sports injuries, to conduct outpatient surgery and in the care of premature and newborn babies.
MRI systems are incredibly important for diagnosing a whole host of potential medical problems and conditions. X-ray technology has helped MRIs. For example, one of the instruments developed for use on Chandra was an X-ray spectrometer that would precisely measure the energy signatures over a key range of X-rays. In order to make these observations, this X-ray spectrometer had to be cooled to extremely low temperatures. Researchers at our Goddard Space Flight Center in Greenbelt, Maryland developed an innovative magnet that could achieve these very cold temperatures using a fraction of the helium that other similar magnets needed, thus extending the lifetime of the instrument’s use in space. These advancements have helped make MRIs safer and require less maintenance.
X-ray diffraction is the technique where X-ray light changes its direction by amounts that depend on the X-ray energy, much like a prism separates light into its component colors. Scientists using Chandra take advantage of diffraction to reveal important information about distant cosmic sources using the observatory’s two gratings instruments, the High Energy Transmission Grating Spectrometer (HETGS) and the Low Energy Transmission Grating Spectrometer (LETGS).
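Diffraction separates X-rays because the bending angle depends on wavelength, which in turn depends on photon energy. A small sketch of that energy-to-wavelength conversion, using the standard hc ≈ 1239.84 eV·nm constant (the example energies are illustrative, not Chandra instrument specifications):

```python
# Convert an X-ray photon's energy to its wavelength, the quantity that
# determines how strongly a diffraction grating bends it.
# hc = 1239.84 eV*nm is the standard physical constant (approximate).

HC_EV_NM = 1239.84

def wavelength_nm(energy_ev: float) -> float:
    """Photon wavelength in nanometers for a given energy in electron-volts."""
    return HC_EV_NM / energy_ev

# Illustrative energies within the general X-ray band:
for kev in (0.5, 2.0, 8.0):
    print(f"{kev} keV -> {wavelength_nm(kev * 1000):.4f} nm")
```

Higher-energy photons have shorter wavelengths, so the gratings spread a source's light out by energy, producing the spectrum the instruments record.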
X-ray diffraction is also used in biomedical and pharmaceutical fields to investigate complex molecular structures, including basic research with viruses, proteins, vaccines and drugs, as well as for cancer, AIDS and immunology studies. How does this work? In most applications, the subject molecule is crystallized and then irradiated. The resulting diffraction pattern establishes the composition of the material. X-rays are perfect for this work because of their ability to resolve small objects. Advances in detector sensitivity and focused beam optics have allowed for the development of systems where exposure times have been shortened from hours to seconds. Shorter exposures coupled with lower-intensity radiation have allowed researchers to prepare smaller crystals, avoid damage to samples and speed up their data runs.
Advanced X-ray detectors with image displays inspect the quality of goods being produced or packaged on a production line. With these systems, the goods do not have to be brought to a special screening area and the production line does not have to be disrupted. The systems range from portable, hand-held models to large automated systems. They are used on such products as aircraft and rocket parts and structures, canned and packaged foods, electronics, semiconductors and microchips, thermal insulations and automobile tires.
X-ray beam lithography can produce extremely fine lines and has applications for developing computer chips and other semiconductor related devices. Several companies are researching the use of focused X-ray synchrotron beams as the energy source for this process, since these powerful beams produce good pattern definition with relatively short exposure times. The grazing incidence optics — that is, the need to skip X-rays off a smooth mirror surface like a stone across a pond and then focus them elsewhere — developed for Chandra were the highest precision X-ray optics in the world and directly influenced this work.
The first X-ray baggage inspection system for airports used detectors nearly identical to those flown in the Apollo program to measure fluorescent X-rays from the Moon. Its design took advantage of the sensitivity of the detectors that enabled the size, power requirements and radiation exposure of the system to be reduced to limits practical for public use, while still providing adequate resolution to effectively screen baggage. The company that developed the technology later developed a system that can simultaneously image, on two separate screens, materials of high atomic weight (e.g. metal hand guns) and materials of low atomic weight (e.g. plastic explosives) that pass through other systems undetected. Variations of these machines are used to screen visitors to public buildings around the world.
Check out Chandra’s 20th anniversary page to see how they are celebrating.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com.
Through our Student Payload Opportunity with Citizen Science, or SPOCS, we’re funding five college teams to build experiments for the International Space Station. The students are currently building their experiments focusing on bacteria resistance or sustainability research. Soon, these experiments will head to space on a SpaceX cargo launch!

University of Idaho SPOCS team lead Hannah Johnson and NASA STEM on Station activity manager Becky Kamas will be taking your questions in an Answer Time session on Thurs., June 3, from 12-1 p.m. EDT here on our Tumblr! Make sure to ask your question now by visiting http://nasa.tumblr.com/ask.

Hannah Johnson recently graduated from the University of Idaho with a Bachelor of Science in Chemical Engineering. She is the team lead for the university’s SPOCS team, Vandal Voyagers I, designing an experiment to test bacteria-resistant polymers in microgravity.

Becky Kamas is the activity manager for STEM on Station at our Johnson Space Center in Houston. She helps connect students and educators to the International Space Station through a variety of opportunities, similar to the ones that sparked her interest in working for NASA when she was a high school student.

Student Payload Opportunity with Citizen Science Fun Facts:
Our scientists and engineers work with SPOCS students as mentors, and mission managers from Nanoracks help them prepare their experiments for operation aboard the space station.
The Vandal Voyagers I team has nine student members, six of whom just graduated from the Department of Chemical and Biological Engineering. Designing the experiment served as a senior capstone project.
The experiment tests polymer coatings on an aluminum 6061 substrate used for handles on the space station. These handles are used every day by astronauts to move throughout the space station and to hold themselves in place with their feet while they work.
The University of Idaho’s SPOCS project website includes regular project updates showing the process they followed while designing and testing the experiment.
Make sure to follow us on Tumblr for your regular dose of space: http://nasa.tumblr.com.