Gender roles

Why do we have to blur the lines of gender roles? Is it so bad to claim it is a man's job to take out the trash? Is it so bad to just tell a kid it's not nice to call someone gay, and let it be? Why tell them that they may be gay and just not know it? Why not let kids be kids? Why does it matter if that kitchen set is marketed toward girls and the cars are marketed toward boys? Are we not able to tell our boys that even though the kitchen set may feature girls on the packaging, it's totally acceptable for them to play with it as well? Or can we not tell our daughters that they can play with Hot Wheels cars even though they have boys on the front? Would that not be MORE effective in the long run?