Hawaii keeps coming up in my life lately. My parents are on vacation in Hawaii, I read a book about a princess in Hawaii, and I even ordered a Hawaiian pizza (LOL). All of which made me wonder about the history of Hawaii. I don't ever remember learning in school about how it was that Hawaii became a part of the United States. I remember learning that it was first a territory and that it became a state in 1959. From what I can read about it, it seems it was nothing but greed that made the Americans dethrone the Hawaiian Queen and take over. Sometimes I think it is so hard to figure out what is truly history: what really happened versus the watered-down versions we are supposed to believe. Does anyone know anything about the history of Hawaii? I would love to know and understand more.
"Elphaba, where I come from, we believe all sorts of things that aren't true. We call it History." The Wizard in Wicked.
2 comments:
I find it interesting that the varied opinions that color just about everything often encircle one golden strand of truth. Finding that one golden strand, though, is often incredibly difficult as we look through the many colored strands surrounding it.
I'd like to be the princess of Hawaii.