- uploaded: Aug 9, 2011
- Hits: 231
What is real? Did Japan happen, or is there a small group of people that leads us to believe it did? Japan is the land of the Rising Sun. Question what is true, or what has been given as true. Japan rocks. I hope I'm right, because that would imply that everyone is safe. Some things are only real when you can touch them. IN LAK'ECH... NATURE LOVES US.