National Geographic, part of The Walt Disney Company, continues its legacy of exploration, science, and storytelling. Through award-winning documentaries, series, and global expeditions, it educates and inspires audiences about the natural world, and Disney+ features its exclusive content, from nature specials to behind-the-scenes explorations. Stay updated on new documentaries, conservation efforts, and National Geographic projects at SamsDisneyDiary!