Are gender roles determined through culture or biology?

Do the roles women and men adopt in society evolve as the culture they grow up in evolves, or are they biologically determined?

Is culture a consequence of biology?

Do you think our existing culture is one of many that could possibly have evolved, or have we attained some kind of cultural pinnacle, as the dominant culture?
