My question on rpg.stackexchange.com seems to have reached a point where a "history-person" would be quite suitable to answer it. So let me rephrase it, so as to be at least marginally suitable for this site.
What were the continent-wide common languages during human history (I can think of English, Latin, and Greek, in reverse chronological order)? What percentage of the populace spoke those languages? What percentage of the literate populace spoke them?
Well, these days I'd say Spanish certainly counts. It is spoken as a first language in just about every country in the Americas south of the Rio Grande (Brazil being the most prominent exception). North of there, English has roughly the same status.
Historically, the best analog I know of is Mongol, which at one point was spoken across Asia from Russia to Manchuria (China too, but only by the rulers). I don't have numbers on 13th century Asian literacy, sad to say. I'd guess that few Mongols were literate. Their alphabet was brand new at the time of their empire, and being pastoralists by culture most of them would have had little use for it. Then again, your typical Chinese or European peasant didn't have much use for literacy in the 13th century either. (I should note here that these days Mongolia's literacy rate is a respectable 97.5%, which is quite a bit better than neighboring China, and puts them slightly more literate than Greece)
I'm going to introduce my definition of "Continental" size as an entity with at least 1 million square miles, and 100 million people in its modern population.
English is one such language, spoken in "North America," specifically Canada and the United States, not to mention the countries that make up the former British "India" and the current Indian subcontinent.
Spanish is spoken in most of the South American continent (except for Brazil). Portuguese is spoken in Brazil, which by itself meets my definition of "Continental size."
Greek was spoken not only in Greece, but in the "subcontinent" of Asia Minor, basically the empire of Alexander the Great. Ditto for Persian in Asia Minor, when the Persians ruled before Alexander.
Under the Roman Empire, Latin was spoken across southern and western Europe, enough of Europe to meet my definition of "continent."
Chinese is spoken in China, a "confederation" of land and people of Continental size. Russia, where Russian is spoken, is larger than most continents.
From the question:
You and I speak "Common" - it's called English. But this is the result of the recent globalization made possible with the advent of the Internet.
This is incorrect. It is the result of Imperial Conquest, and that, I think, is the real heart of the matter.
If you look at Common as an imperial language - an official language of government - then yes, human kingdoms at war and "non-humans" (and let's be frank, that notion is grounded firmly in Tolkien's quaint Victorian notions of race) would both speak it and their own language, even if the empire is a fading memory.
Take, for instance, not a continent, but two subcontinents: India and Europe.
In medieval Europe, if you knew Latin, you could generally find someone in town who knew it as well - clergyman or clerk - and you could fake your way through a conversation in a place that spoke a romance language, if you hit your language rolls. So, Latin as "Common" would still require a player to sink some stats in languages if they want to talk to random villagers.
In modern India, you have the language of Empire, English - if you do business or deal with the law or government, you speak it. You also have the language of faith - Hindi - which even non-Hindus learn in order to communicate with others in the community. Then you have twenty-one "mother tongues" - languages learned from your mother; each is the official language of the place where you live. Of course, there are even more unofficial "mother tongues": the language of your ethnicity, of your social caste, of your particular village that's different from the province's.
So, you would have an imperial language or two, "Common1, Common2", and some NPCs might know one better than other NPCs, but most everyone would know a smattering of either. Mother Tongues can be then broken down by race (ethnic-centric language) and alignment (caste-centric language).
So the way to run the campaign is to make the players roll language skill to speak Common, to see if they can actually communicate. Knowing another "Mother Tongue" fantasy language, Elvish or Evil, improves the ability to talk to those NPCs that might also know it, even in passing.
Jacques Cousteau has a story of his wife, French, trying to hail a Greek captain on a nearby yacht, and both parties are attempting to say hello in every language they know - and though France and Greece are only a few hundred miles apart, they wind up speaking in Japanese! In this way, learning languages should improve a character's ability to speak with others generally.
See also: lingua franca, a "third language" in which people who don't know each other's languages communicate, and poorly.
Everyone knows Common. No one knows Common very well.
Well firstly, what do you mean by 'continent'? Is Europe a continent? Is India?
Remember that nowadays lots of people in the same country (upper & lower class) speak roughly the same language. However, that wasn't always the case. You can see this in places where minorities who had very little power would not speak the language of government, e.g. serfs in the field speaking Old English and the Norman lords speaking Norman French.
Also, people in different classes/professions would know different languages, e.g. Catholic priests and other educated people would know Latin, and Orthodox priests might know Greek in the mediæval period. In later centuries, educated people might know French. But that doesn't mean the common man on the ground would know Latin or French.
Here is a list of SOME "continent-wide languages in human history":
- Ancient Greek:
a. The Greek language was spoken throughout much of the Mediterranean and Black Sea regions beginning in the 700s BC/BCE, and was widely used across Southern and Southeastern Europe during ancient times.
b. With the rise of Alexander the Great, the Greek language became truly internationalized by reaching into Egypt and the Middle East, thus expanding Greek beyond Southern Europe and Anatolia (present-day Turkey).
- Latin: With the defeat of Carthage and the various Greek dynasties, the Roman Empire spread the Latin language across far more of continental Europe than the Ancient Greeks had reached. The Latin linguistic legacy lived on beyond the fall of Rome and into the Medieval period.
- Arabic: Arabic has been, since Medieval times, the national language of the entirety of North Africa, as well as of a sizable portion of West Asia/the Middle East.
- Spanish: Every country in South America (with the exception of Brazil) has been a Spanish-speaking state since the mid-1500s. Every state in Central America has also been Spanish-speaking for nearly 500 years. Even a sizable portion of the United States, during our early history, was primarily Spanish-speaking.
- English: English is probably the most widely spoken language in the world in late Modern times and into the contemporary age. It is the central language of the United States, as well as of the majority of Canada; a second language in many parts of Europe; and the central language of Australia. The continental legacy of English is due to the massive presence of British colonialism during Modern times, followed by the worldwide presence and influence of the United States since 1945.
- Russian: The Russian Empire, followed by the Soviet "Union" (or Empire), made Russian either a central or a secondary language across much of continental Asia. Countries such as Kazakhstan, Uzbekistan, Tajikistan, the Baltic countries and Eastern Europe (during the Cold War) communicated in Russian as either a primary or secondary language.
Just to provide this widely interpretable question an answer that is specific to China: today there is a common language called putonghua, based on the Beijing dialect of Mandarin. While millions of Chinese people speak the Beijing dialect as their first dialect, in much of China (see map) local people speak a dialect of Mandarin that is somewhat different from putonghua and at times not mutually intelligible with it. It might be like someone from Scotland learning to speak American English, and an American learning to understand people in Scotland (I found it hard). In my experience, everyone in these areas under the age of 50 is also conversant in putonghua.
Furthermore, there are a great many people in China (~300 million?) who speak putonghua as a second language, their first language being completely different from Mandarin, e.g. Wu, Min, Gan, Xiang, Hui, Yue (Cantonese), Ping, etc. Interestingly, some of these languages also have dialects that are not mutually intelligible. I recently met a person from Guangdong who spoke 3 Yue dialects, two Mandarin dialects, and English. Would this be 6 languages, or 3? It can be argued either way (he is a computer scientist).
Here is a great blog post by The Economist about the difference between languages and dialects in China.
Putonghua, along with simplified characters and Hanyu Pinyin, was standardized by the CCP regime in the 1950s with a lot of assistance from Soviet advisors who had a ton of experience with language policy and literacy. Some of you may be interested to know that the predecessor to Hanyu Pinyin, named Latinxua Sin Wenz, was developed in the Soviet Union to encourage literacy among Chinese migrant workers from Shandong province resident in Siberia. The CCP regime's motives for enforcing the Beijing dialect nationwide make an interesting story, but it is too long, and too tangential, to be included as an answer to this question.
Prior to the 1950s, China had many older standardizations. I don't know too much about it, but another question on this site provides a bunch of details about standardized Chinese going back to about 500 BCE.
Who Were the Celts
The Celts were a collection of tribes with origins in central Europe that shared a similar language, religious beliefs, traditions and culture. It’s believed that the Celtic culture started to evolve as early as 1200 B.C. The Celts spread throughout western Europe—including Britain, Ireland, France and Spain—via migration. Their legacy remains most prominent in Ireland and Great Britain, where traces of their language and culture are still prominent today.
The existence of the Celts was first documented in the seventh or eighth century B.C. The Romans, who would later rule much of southern Europe, referred to the Celts as "Galli," meaning barbarians.
However, the Celts (pronounced with a hard "c" or "k" sound) were anything but barbarians, and many aspects of their culture and language have survived through the centuries.
Preserved tattoos on ancient mummified human remains reveal that tattooing has been practiced throughout the world for many centuries.  In 2015, scientific re-assessment of the age of the two oldest known tattooed mummies identified Ötzi as the oldest example then known. This body, with 61 tattoos, was found embedded in glacial ice in the Alps, and was dated to 3250 BCE.   In 2018, the oldest figurative tattoos in the world were discovered on two mummies from Egypt which are dated between 3351 and 3017 BCE. 
Ancient tattooing was most widely practiced among the Austronesian people. It was one of the early technologies developed by the Proto-Austronesians in Taiwan and coastal South China prior to at least 1500 BCE, before the Austronesian expansion into the islands of the Indo-Pacific. It may have originally been associated with headhunting. Tattooing traditions, including facial tattooing, can be found among all Austronesian subgroups, including Taiwanese Aborigines, Islander Southeast Asians, Micronesians, Polynesians, and the Malagasy people. For the most part, Austronesians used characteristic perpendicularly hafted tattooing points, using a wooden mallet to tap the handle and drive the points into the skin. The handle and mallet were generally made of wood, while the points, whether single, grouped, or arranged to form a comb, were made of citrus thorns, fish bone, bone, teeth, and turtle and oyster shells.
Ancient tattooing traditions have also been documented among Papuans and Melanesians, with their use of distinctive obsidian skin piercers. Some archeological sites with these implements are associated with the Austronesian migration into Papua New Guinea and Melanesia. But other sites are older than the Austronesian expansion, being dated to around 1650 to 2000 BCE, suggesting that there was a preexisting tattooing tradition in the region.  
Among other ethnolinguistic groups, tattooing was also practiced among the Ainu people of Japan; some Austroasiatic peoples of Indochina; Berber women of Tamazgha (North Africa); the Yoruba, Fulani and Hausa people of Nigeria; Native Americans of the Pre-Columbian Americas; and the Welsh and Picts of Iron Age Britain.
Cemeteries throughout the Tarim Basin (Xinjiang of western China) including the sites of Qäwrighul, Yanghai, Shengjindian, Zaghunluq, and Qizilchoqa have revealed several tattooed mummies with Western Asian/Indo-European physical traits and cultural materials. These date from between 2100 and 550 BC. 
In ancient China, tattoos were considered a barbaric practice associated with the Yue peoples of southeastern and southern China. Tattoos were often referred to in literature depicting bandits and folk heroes. As late as the Qing Dynasty, it was common practice to tattoo characters such as 囚 ("Prisoner") on convicted criminals' faces. Although relatively rare during most periods of Chinese history, slaves were also sometimes marked to display ownership.
However, tattoos seem to have remained a part of southern culture. Marco Polo wrote of Quanzhou, "Many come hither from Upper India to have their bodies painted with the needle in the way we have elsewhere described, there being many adepts at this craft in the city". At least three of the main characters – Lu Zhishen, Shi Jin (史進), and Yan Ching (燕青) – in the classic novel Water Margin are described as having tattoos covering nearly all of their bodies. Wu Song was sentenced to a facial tattoo describing his crime after killing Xi Menqing (西門慶) to avenge his brother. In addition, Chinese legend claimed the mother of Yue Fei (a famous Song general) tattooed the words "Repay the Country with Pure Loyalty" ( 精忠報國 , jing zhong bao guo) down her son's back before he left to join the army.
The earliest possible evidence for tattooing in Europe appears on ancient art from the Upper Paleolithic period as incised designs on the bodies of humanoid figurines.  The Löwenmensch figurine from the Aurignacian culture dates to approximately 40,000 years ago  and features a series of parallel lines on its left shoulder. The ivory Venus of Hohle Fels, which dates to between 35,000 and 40,000 years ago  also exhibits incised lines down both arms, as well as across the torso and chest.
The oldest and most famous direct proof of ancient European tattooing appears on the body of Ötzi the Iceman, who was found in the Ötz valley in the Alps and dates from the late 4th millennium BC. Studies have revealed that Ötzi had 61 carbon-ink tattoos consisting of 19 groups of simple dots and lines on his lower spine, left wrist, behind his right knee and on his ankles. It has been argued that these tattoos were a form of healing because of their placement, though other explanations are plausible.
The Picts may have been tattooed (or scarified) with elaborate, war-inspired black or dark blue woad (or possibly copper for the blue tone) designs. Julius Caesar described these tattoos in Book V of his Gallic Wars (54 BC). Nevertheless, these may have been painted markings rather than tattoos. 
In his encounter with a group of pagan Scandinavian Rus' merchants in the early 10th century, Ahmad ibn Fadlan described what he witnessed among them, including their appearance. He notes that the Rus' were heavily tattooed: "From the tips of his toes to his neck, each man is tattooed in dark green with designs, and so forth." Raised in the aftermath of the Norman conquest of England, William of Malmesbury describes in his Gesta Regum Anglorum that the Anglo-Saxons were tattooed at the time of the Normans' arrival ("arms covered with golden bracelets, tattooed with coloured patterns ...").
The significance of tattooing was long open to Eurocentric interpretations. In the mid-19th century, Baron Haussmann, while arguing against painting the interior of Parisian churches, said the practice "reminds me of the tattoos used in place of clothes by barbarous peoples to conceal their nakedness". 
Greece and Rome
Greek written records of tattooing date back to at least the 5th century BCE. The ancient Greeks and Romans used tattooing to penalize slaves, criminals, and prisoners of war. Decorative tattooing, while known, was looked down upon, and religious tattooing was mainly practiced in Egypt and Syria. According to Robert Graves in his book The Greek Myths, tattooing was common amongst certain religious groups in the ancient Mediterranean world, which may have contributed to the prohibition of tattooing in Leviticus. The Romans of Late Antiquity also tattooed soldiers and arms manufacturers, a practice that continued into the ninth century.
The Greek verb stizein (στίζειν), meaning "to prick," was used for tattooing. Its derivative stigma (στίγμα) was the common term for tattoo marks in both Greek and Latin. During the Byzantine period, the verb kentein (κεντεῖν) replaced stizein, and a variety of new Latin terms replaced stigmata, including signa ("signs"), characteres ("stamps"), and cicatrices ("scars").
British and other pilgrims to the Holy Lands throughout the 17th century were tattooed with the Jerusalem Cross to commemorate their voyages,  including William Lithgow in 1612. 
In 1691, William Dampier brought to London a Filipino man named Jeoly or Giolo from the island of Mindanao (Philippines) who had a tattooed body and became known as the "Painted Prince".
Between 1766 and 1779, Captain James Cook made three voyages to the South Pacific, the last ending with Cook's death in Hawaii in February 1779. When Cook and his men returned home to Europe from their voyages to Polynesia, they told tales of the 'tattooed savages' they had seen. The word "tattoo" itself comes from the Tahitian tatau, and was introduced into the English language by Cook's expedition (though the word 'tattoo' or 'tap-too', referring to a drumbeat, had existed in English since at least 1644).
It was in Tahiti, aboard the Endeavour in July 1769, that Cook first recorded his observations about the indigenous body modification; this is the first recorded use of the word tattoo to refer to the permanent marking of the skin. The ship's log book records this entry: "Both sexes paint their Bodys, Tattow, as it is called in their Language. This is done by inlaying the Colour of Black under their skins, in such a manner as to be indelible." Cook went on to write, "This method of Tattowing I shall now describe. As this is a painful operation, especially the Tattowing of their Buttocks, it is performed but once in their Lifetimes."
Cook's Science Officer and Expedition Botanist, Sir Joseph Banks, returned to England with a tattoo. Banks was a highly regarded member of the English aristocracy and had acquired his position with Cook by putting up what was at the time the princely sum of some ten thousand pounds toward the expedition. In turn, Cook brought back with him a tattooed Raiatean man, Omai, whom he presented to King George and the English Court. Many of Cook's men, ordinary seamen and sailors, came back with tattoos, a tradition that would soon become associated with men of the sea in the public's mind and the press of the day. In the process, sailors and seamen re-introduced the practice of tattooing in Europe, and it spread rapidly to seaports around the globe.
By the 19th century, tattooing had spread to British society but was still largely associated with sailors  and the lower or even criminal class.  Tattooing had however been practised in an amateur way by public schoolboys from at least the 1840s   and by the 1870s had become fashionable among some members of the upper classes, including royalty.   In its upmarket form, it could be a lengthy, expensive  and sometimes painful  process.
Tattooing spread among the upper classes all over Europe in the 19th century, but particularly in Britain, where it was estimated in Harmsworth Magazine in 1898 that as many as one in five members of the gentry were tattooed. Taking their lead from the British Court, where George V followed Edward VII's lead in getting tattooed, King Frederick IX of Denmark, the King of Romania, Kaiser Wilhelm II, King Alexander of Yugoslavia and even Tsar Nicholas II of Russia all sported tattoos, many of them elaborate and ornate renditions of the Royal Coat of Arms or the Royal Family Crest. King Alfonso XIII of Spain also had a tattoo.
The perception that there is a marked class division in the acceptability of the practice has been a popular media theme in Britain, as successive generations of journalists described the practice as newly fashionable and no longer for a marginalised class. Examples of this cliché can be found in every decade since the 1870s. Despite this evidence, a myth persists that the upper and lower classes find tattooing attractive while the broader middle classes reject it. In 1969, the House of Lords debated a bill to ban the tattooing of minors, on the grounds that it had become "trendy" with the young in recent years but was associated with crime. It was noted that 40 per cent of young criminals had tattoos and that marking the skin in this way tended to encourage self-identification with criminal groups. Two peers, Lord Teynham and the Marquess of Aberdeen and Temair, however, rose to object that they had been tattooed as youngsters, with no ill effects. Since the 1970s, tattoos have become more socially acceptable and fashionable among celebrities. Tattoos are less prominent on figures of authority, and tattooing by the elderly is still considered remarkable.
Malay Archipelago
Several tribes in the insular parts have tattooing in their culture. One notable example is the Dayak people of Kalimantan in Borneo (Bornean traditional tattooing). Another ethnic group that practices tattooing are the Mentawai people, as well as Moi and Meyakh people in West Papua. 
Tattooing for spiritual and decorative purposes in Japan is thought to extend back to at least the Jōmon or Paleolithic period and was widespread during various periods for both the Yamato and native Jomon groups. Chinese texts from before 300 AD described social differences among Japanese people as being indicated through tattooing and other bodily markings. Chinese texts from the time also described Japanese men of all ages as decorating their faces and bodies with tattoos.
Between 1603 and 1868, Japanese tattooing was only practiced by the ukiyo (floating world) subculture. Generally, firemen, manual workers and prostitutes wore tattoos to communicate their status. By the early 17th century, criminals were widely being tattooed as a visible mark of punishment. Criminals were marked with symbols typically including crosses, lines, double lines and circles on certain parts of the body, mostly the face and arms. These symbols sometimes designated the places where the crimes were committed. In one area, the character for "dog" was tattooed on the criminal's forehead.
The Government of Meiji Japan, formed in 1868, banned the art of tattooing altogether, viewing it as barbaric and lacking respectability. This subsequently created a subculture of criminals and outcasts. These people had no place in "decent society" and were frowned upon. They could not simply integrate into mainstream society because of their obvious visible tattoos, forcing many of them into criminal activities which ultimately formed the roots for the modern Japanese mafia, the Yakuza, with which tattoos have become almost synonymous in Japan.
North Africa
Egypt and Nubia
Despite a lack of direct textual references, tattooed human remains and iconographic evidence indicate that ancient Egyptians practiced tattooing from at least 2000 BCE. It is theorized that tattooing entered Egypt through Nubia, but this claim is complicated by the high mobility between Lower Nubia and Upper Egypt, as well as Egypt's annexation of Lower Nubia during the Middle Kingdom. Archeologist Geoffrey J. Tassie argues that it may be more appropriate to classify tattooing in ancient Egypt and Nubia as part of a larger Nile Valley tradition.
The most famous tattooed mummies from this region are Amunet, a priestess of Hathor, and two Hathoric dancers from Dynasty XI that were found at Deir el-Bahari. In 1898, Daniel Fouquet, a medical doctor from Cairo, wrote an article on medical tattooing practices in ancient Egypt in which he describes the tattoos on these three mummies and speculates that they may have served a medicinal or therapeutic purpose: "The examination of these scars, some white, others blue, leaves in no doubt that they are not, in essence, ornament, but an established treatment for a condition of the pelvis, very probably chronic pelvic peritonitis."
Ancient Egyptian tattooing appears to have been practiced exclusively on women; with the possible exception of one extremely worn Dynasty XII stele, there is no artistic or physical evidence that men were tattooed. However, by the Meroitic Period (300 BCE – 400 CE), it was practiced on Nubian men as well.
Accounts of early travelers to ancient Egypt describe the tool used as an uneven number of metal needles attached to a wooden handle.
Two well-preserved Egyptian mummies from c. 2000 B.C.E., a priestess and a temple dancer for the fertility goddess Hathor, bear random dot and dash tattoo patterns on the lower abdomen, thighs, arms, and chest.
Coptic tattoos often consist of three lines, three dots, and two elements, reflecting the Trinity. The tools used had an odd number of needles to bring luck and good fortune. Many Copts have the Coptic cross tattooed on the inside of their right arm. This may have been influenced by a similar practice of tattooing religious symbols on the wrists and arms during the Ptolemaic period.
Herodotus' writings suggest that slaves and prisoners of war were tattooed in Persia during the classical era. This practice spread from Persia to Greece and then to Rome.
The most famous depiction of tattooing in Persian literature goes back 800 years to a tale by Rumi about a man who is proud to want a lion tattoo but changes his mind once he experiences the pain of the needle. 
In the hamam (the baths), there were dallaks whose job was to help people wash themselves. This was a notable occupation because apart from helping the customers with washing, they were massage-therapists, dentists, barbers and tattoo artists. 
Tattooing has been a part of Filipino life since before the Hispanic colonization of the Philippine Islands. Tattooing in the Philippines was, to some, a form of rank and accomplishment, and some believed that tattoos had magical qualities. The more famous tattooed indigenous peoples of the Philippines resided in north Luzon, especially among the Bontoc, Kalinga and Ifugao peoples. The Visayans of the southern islands were also heavily tattooed.
Filipino tattooing was first documented by Spanish explorers as they landed among the islands in the late 16th century; they called the natives Los Pintados (The Painted Ones), as they mistook the tattoos for paint. Before European exploration, tattooing was widespread, but conversion to Christianity greatly diminished the practice, which came to be dismissed as heathen or low-class.
As Lane Wilcken's Filipino Tattoos Ancient to Modern notes, there are many similarities between the tattooing traditions of the Philippines and indigenous Polynesian designs – not only in their societal function and similar designs, but in the tools used to hand-tap them (a needle or thorn on a stick, with a hammer to pound it into the skin). While the most common modern term for indigenous tattoos is batok, an ancient Tagalog word for tattoos was tatak, extremely similar to the Samoan word tatau.
Marquesas Islands
New Zealand
The Māori people of New Zealand practised a form of tattooing known as tā moko, traditionally created with chisels.
However, from the late 20th century onward, there has been a resurgence of tā moko among Māori, taking on European styles. Traditional tā moko was reserved for the head area. There is also a related tattoo art, kirituhi, which has a similar aesthetic to tā moko but is worn by non-Māori.
The traditional male tattoo in Samoa is called the pe'a. The traditional female tattoo is called the malu. The word tattoo is believed to have originated from the Samoan word tatau.
When the Samoan Islands were first seen by Europeans in 1722, three Dutch ships commanded by Jacob Roggeveen visited the eastern island known as Manua. A crew member of one of the ships described the natives in these words: "They are friendly in their speech and courteous in their behavior, with no apparent trace of wildness or savagery. They do not paint themselves, as do the natives of some other islands, but on the lower part of the body they wear artfully woven silk tights or knee breeches. They are altogether the most charming and polite natives we have seen in all of the South Seas."
The ships lay at anchor off the islands for several days, but the crews did not venture ashore and did not even get close enough to the natives to realize that they were not wearing silk leggings; their legs were completely covered in tattoos.
In Samoa, the tradition of applying tattoo, or tatau, by hand has been unbroken for over two thousand years. Tools and techniques have changed little. The skill is often passed from father to son, each tattoo artist, or tufuga, learning the craft over many years of serving as his father's apprentice. A young artist-in-training often spent hours, and sometimes days, tapping designs into sand or tree bark using a special tattooing comb, or au. Honoring their tradition, Samoan tattoo artists made this tool from sharpened boar's teeth fastened together with a portion of turtle shell and attached to a wooden handle.
Traditional Samoan tattooing of the "pe'a", or body tattoo, is an ordeal that is not lightly undergone. It takes many weeks to complete. The process is very painful and used to be a necessary prerequisite to receiving a matai title; this, however, is no longer the case. Tattooing was also a very costly procedure.
Samoan society has long been defined by rank and title, with chiefs (ali'i) and their assistants, known as talking chiefs (tulafale). The tattooing ceremonies for young chiefs, typically conducted at the time of puberty, were part of their ascendance to a leadership role. The permanent marks left by the tattoo artists would forever celebrate their endurance and dedication to cultural traditions. The pain was extreme and the risk of death by infection was a concern; to back down from tattooing was to risk being labeled a "pala'ai" or coward. Those who could not endure the pain and abandoned their tattooing were left incomplete and would be forced to wear their mark of shame throughout their lives. This would forever bring shame upon their family, so it was avoided at all costs.
The Samoan tattooing process uses a number of tools that have remained almost unchanged since their first use. The autapulu is a wide tattooing comb used to fill in the large dark areas of the tattoo. The ausogi'aso tele is a comb used for making thick lines, the ausogi'aso laititi for thin lines, and the aumogo, a small comb, for making small marks. The sausau is the mallet used for striking the combs; it is almost two feet in length and made from the central rib of a coconut palm leaf. The tuluma is the pot used for holding the tattooing combs, and the ipulama is the cup used for holding the dye, which is made from the soot of burnt lama nuts. The tu'i is used to grind up the dye. These tools were primarily made of animal bone to ensure sharpness.
The tattooing process itself traditionally consisted of five sessions, spread out over ten days so that the inflammation could subside between sittings.
Christian missionaries from the West attempted to purge tattooing among the Samoans, thinking it barbaric and inhumane. Many young Samoans resisted mission schools because they forbade tattoos, but over time attitudes toward this cultural tradition relaxed and tattooing began to reemerge in Samoan culture.
Tattooed mummies dating to c. 500 BC were extracted from burial mounds on the Ukok plateau during the 1990s. Their tattooing involved animal designs carried out in a curvilinear style. The Man of Pazyryk, a Scythian chieftain, is tattooed with an extensive and detailed range of fish, monsters and a series of dots that lined up along the spinal column (lumbar region) and around the right ankle.
Solomon Islands
Some artifacts dating back 3,000 years from the Solomon Islands may have been used for tattooing human skin. Researchers duplicated obsidian pieces, used them to tattoo pig skin, and compared the resulting wear to the original artifacts. "They conducted these experiments to observe the wear, such as chipping and scratches, and residues on the stones caused by tattooing, and then compared that use-wear with 3,000 year old artifacts. They found that the obsidian pieces, old and new, show similar patterns, suggesting that they hadn't been used for working hides, but were for adorning human skin."
In Taiwan, facial tattoos of the Atayal people are called ptasan; they are used to demonstrate that an adult man can protect his homeland, and that an adult woman is qualified to weave cloth and perform housekeeping.
Taiwan is believed to be the homeland of all the Austronesian peoples, which include Filipinos, Indonesians, Polynesians and Malagasy peoples, all with strong tattoo traditions. This, along with the striking correlation between Austronesian languages and the use of the so-called hand-tapping method, suggests that Austronesian peoples inherited their tattooing traditions from ancestors established in Taiwan or along the southern coast of the Chinese mainland.
Thai tattoos, also known as Yantra tattooing, have been common since ancient times. As in other native Southeast Asian cultures, animistic tattooing was common among Tai tribes in southern China. Over time, this animistic practice of tattooing for luck and protection assimilated Hindu and Buddhist ideas. The Sak Yant traditional tattoo is still practiced today and is usually given by either a Buddhist monk or a Brahmin priest. The tattoos usually depict Hindu gods and use the Mon script or ancient Khmer script, the scripts of the classical civilizations of mainland Southeast Asia.
Central America
A Spanish expedition led by Gonzalo de Badajoz in 1515 across what is today Panama ran into a village where prisoners from other tribes had been marked with tattoos.
[The Spaniards] found, however, some slaves who were branded in a painful fashion. The natives cut lines in the faces of the slaves, using a sharp point either of gold or of a thorn; they then fill the wounds with a kind of powder dampened with black or red juice, which forms an indelible dye and never disappears. The Spaniards took these slaves with them. It seems that this juice is corrosive and produces such terrible pain that the slaves are unable to eat on account of their sufferings.
North America
Indigenous People of North America
Indigenous People of North America have a long history of tattooing. Tattooing was not a simple marking on the skin: it was a process that highlighted cultural connections to Indigenous ways of knowing and viewing the world, as well as connections to family, society, and place.
There is no way to determine the actual origin of tattooing for Indigenous People of North America. The oldest known physical evidence of tattooing in North America is a frozen, mummified Inuit woman discovered on St. Lawrence Island, Alaska, with tattoos on her skin. Through radiocarbon dating of the tissue, scientists estimated that she lived in the 16th century. Until recently, archeologists did not prioritize the classification of tattoo implements when excavating known historic sites. Recent review of materials found at the Mound Q excavation site points toward elements of tattoo bundles that date from pre-colonization times. Scholars explain that the recognition of tattoo implements is significant because it highlights the cultural importance of tattooing for Indigenous People.
Early explorers to North America made many ethnographic observations about the Indigenous People they met. Initially, they did not have a word for tattooing and instead described the skin modifications with terms ranging from "pounce, prick, list, mark, and raze" to "stamp, paint, burn, and embroider." In 1585–1586, Thomas Harriot, who was part of the Grenville Expedition, was responsible for making observations about Indigenous People of North America. In A Brief and True Report of the New Found Land of Virginia, Harriot recorded that some Indigenous People had their skin dyed and coloured, while John White provided visual representations of Indigenous People in the form of drawings and paintings. Harriot and White also provided information highlighting specific markings seen on Indigenous chiefs of the time. In 1623, the missionary Gabriel Sagard described seeing men and women with tattoos on their skin.
The Jesuit Relations of 1652 describes tattooing among the Petun and the Neutrals:
But those who paint themselves permanently do so with extreme pain, using, for this purpose, needles, sharp awls, or piercing thorns, with which they perforate, or have others perforate, the skin. Thus they form on the face, the neck, the breast, or some other part of the body, some animal or monster, for instance, an Eagle, a Serpent, a Dragon, or any other figure which they prefer; and then, tracing over the fresh and bloody design some powdered charcoal, or other black coloring matter, which becomes mixed with the blood and penetrates within these perforations, they imprint indelibly upon the living skin the designed figures. And this in some nations is so common that in the one which we called the Tobacco, and in that which -- on account of enjoying peace with the Hurons and with the Iroquois -- was called Neutral, I know not whether a single individual was found, who was not painted in this manner, on some part of the body.
From 1712 to 1717, Joseph François Lafitau, another Jesuit missionary, recorded how Indigenous People applied tattoos to their skin and developed healing strategies, such as tattooing along the jawline to treat toothaches. Indigenous People had determined that certain nerves along the jawline connected to certain teeth; by tattooing those nerves, they could stop them from firing the signals that led to toothaches. Some of these early ethnographic accounts questioned the practice of tattooing and hypothesized that it could make people sick due to unsanitary approaches.
Scholars explain that the study of Indigenous tattooing is relatively new, as it was initially perceived as the behaviour of societies outside the norm. The process of colonization introduced new views of what counted as acceptable behaviour, leading to the near erasure of the tattoo tradition for many nations. However, through oral traditions, the information about tattoos and the actual practice of tattooing has persisted to the present day.
The St. Lawrence Iroquoians used bones as tattooing needles. In addition, turkey bone tattooing tools were discovered at an ancient site in Fernvale, Tennessee, dated to 3500–1600 BCE.
Inuit People
The Inuit People have a deep history of tattooing. In the Inuktitut language, the word kakiniit translates to the English word for tattoo, and the word tunniit means face tattoo. Among the Inuit, some nations tattooed female faces and parts of the body to symbolize a girl's transition into womanhood, coinciding with the start of her first menstrual cycle. A tattoo represented a woman's beauty, strength, and maturity. This was an important practice because some Inuit believed that a woman could not transition into the spirit world without tattoos on her skin. The Inuit People have oral traditions that describe how the raven and the loon tattooed each other, giving cultural significance to both the act of tattooing and the role of those animals in Inuit history. European missionaries who colonized the Inuit People at the beginning of the 20th century cast tattooing as an evil practice, "demonizing" anyone who valued tattoos. Alethea Arnaquq-Baril has helped Inuit women revitalize the practice of traditional face tattoos through her documentary Tunniit: Retracing the Lines of Inuit Tattoos, in which she interviews elders from different communities, asking them to recall their own elders and the history of tattoos. The elders recalled the traditional practice, which often involved dipping a thread in soot or seal oil and sewing the tattoo into the skin, or poking the skin with a sharp needle point dipped in soot or seal oil. Hovak Johnston has worked with the elders in her community to bring the tradition of kakiniit back, learning the traditional ways of tattooing and using her skills to tattoo others.
Osage Nation
The Osage People used tattooing for a variety of reasons. Tattoo designs were based on the belief that people were part of the larger cycle of life, and they integrated elements of the land, sky, water, and the space in between to symbolize these beliefs. In addition, the Osage People believed in a smaller cycle of life, recognizing the importance of women giving life through childbirth and men taking life through warfare. Osage men were often tattooed after accomplishing major feats in battle, as a visual and physical reminder of their elevated status in the community. Some Osage women were tattooed in public as a form of prayer, demonstrating strength and dedication to their nation.
Haudenosaunee People
The Haudenosaunee People historically used tattooing in connection with war. A tradition for many young men was to journey into the wilderness, fast from all food, and discover their personal manitou. Scholars explain that this process of discovery likely included dreams and visions that would bring a specific manitou to the forefront for each young man. The manitou became an important element of protection during warfare, and many boys tattooed their manitou onto their bodies to symbolize its cultural significance in their lives. As they showed success in warfare, male warriors acquired more tattoos, some even keeping score of all the kills they had made. Some warriors had facial tattoos that tallied how many people they had scalped in their lifetime.
Tattooing in the early United States
In the period shortly after the American Revolution, sailors used government-issued protection papers to establish their American citizenship and avoid impressment by British Navy ships. However, many of the physical descriptions in the seamen's protection certificates were so general, and it was so easy to abuse the system, that many impressment officers of the Royal Navy simply paid no attention to them. "In applying for a duplicate Seaman's Protection Certificate in 1817, James Francis stated that he 'had a protection granted him by the Collector of this Port on or about 12 March 1806 which was torn up and destroyed by a British Captain when at sea.'"
One way of making the certificates more specific and more effective was to describe a tattoo, which is highly personal in subject and location, and use that description to identify the seaman precisely. As a result, many of the official certificates also carried information about tattoos and scars, along with other identifying details. This perhaps also led to a proliferation of tattoos among American seamen who wanted to avoid impressment; during this period, tattoos were not popular with the rest of the country. "Frequently the 'protection papers' made reference to tattoos, clear evidence that the individual was a seafaring man; rarely did members of the general public adorn themselves with tattoos."
"In the late eighteenth and early nineteenth centuries, tattoos were as much about self-expression as they were about having a unique way to identify a sailor's body should he be lost at sea or impressed by the British navy. The best source for early American tattoos is the protection papers issued following a 1796 congressional act to safeguard American seamen from impressment. These proto-passports catalogued tattoos alongside birthmarks, scars, race, and height. Using simple techniques and tools, tattoo artists in the early republic typically worked on board ships using anything available as pigments, even gunpowder and urine. Men marked their arms and hands with initials of themselves and loved ones, significant dates, symbols of the seafaring life, liberty poles, crucifixes, and other symbols." 
Sometimes, to protect themselves, sailors requested not only that their tattoos be described but also that they be sketched on the protection certificate. As one researcher said, "Clerks writing the documents often sketched the tattoos as well as describing them."
"Reintroduction" to the Western world
The popularity of modern Western tattooing owes its origins in large part to Captain James Cook's voyages to the South Pacific in the 1770s, but since the 1950s a false belief has persisted that modern Western tattooing originated exclusively from these voyages. Tattooing has been consistently present in Western society from the modern period stretching back to Ancient Greece, though largely for different reasons. A long history of European tattooing predated these voyages, including among sailors and tradesmen, pilgrims visiting the Holy Land, and Europeans living among Native Americans.
Tattoo historian Anna Felicity Friedman suggests a couple of reasons for the "Cook Myth". First, modern European words for the practice (e.g., "tattoo", "tatuaje", "tatouage", "Tätowierung", and "tatuagem") derive from the Tahitian word tatau, which was introduced to European languages through Cook's travels. Prior European texts, however, show that a variety of metaphorical terms were used for the practice, including "pricked", "marked", "engraved", "decorated", "punctured", "stained", and "embroidered". Friedman also points out that the growing print culture at the time of Cook's voyages may have increased the visibility of tattooing despite its prior existence in the West.
The first documented professional tattooer in the United States was Martin Hildebrandt, a German immigrant who arrived in Boston, Massachusetts in 1846. Between 1861 and 1865, he tattooed soldiers on both sides in the American Civil War. The first documented professional tattooist (with a permanent studio, working on members of the paying public) in Britain was Sutherland Macdonald in the early 1880s. Tattooing was an expensive and painful process and by the late 1880s had become a mark of wealth for the crowned heads of Europe.
In 1891, New York tattooer Samuel O'Reilly patented the first electric tattoo machine, a modification of Thomas Edison's electric pen.
The earliest appearance of tattoos on women during this period was in the circus in the late 19th century. These "Tattooed Ladies" were covered — with the exception of their faces, hands, necks, and other readily visible areas — with various images inked into their skin. To lure the crowd, the earliest of these ladies, such as Betty Broadbent and Nora Hildebrandt, told tales of captivity; they usually claimed to have been taken hostage by Native Americans who tattooed them as a form of torture. By the late 1920s, however, the sideshow industry was slowing, and by the late 1990s the last tattooed lady was out of business.
The Tattoo Renaissance
Tattooing has steadily increased in popularity since the invention of the electric tattoo machine. In 1936, 1 in 10 Americans had a tattoo of some form. In the late 1950s, tattooing was greatly influenced by several artists, in particular Lyle Tuttle, Cliff Raven, Don Nolan, Zeke Owens, Spider Webb and Don Ed Hardy. A second generation of artists, trained by the first, continued these traditions into the 1970s, and included artists such as Bob Roberts, Jamie Summers, and Jack Rudy.
Since the 1970s, tattoos have become a mainstream part of global and Western fashion, common among both sexes, across all economic classes, and among age groups from the late teen years to middle age. The decoration of blues singer Janis Joplin with a wristlet and a small heart on her left breast, by the San Francisco tattoo artist Lyle Tuttle, has been called a seminal moment in the popular acceptance of tattoos as art. Formal interest in the art of the tattoo became prominent in the 1970s through the beginning of the 21st century. For many young Americans, the tattoo has taken on a decidedly different meaning than for previous generations. The tattoo has "undergone dramatic redefinition" and has shifted from a form of deviance to an acceptable form of expression.
In 1988, scholar Arnold Rubin compiled a collection of works on the history of tattoo cultures, publishing them as Marks of Civilization. In it, the term "Tattoo Renaissance" was coined, referring to a period marked by technological, artistic and social change. Wearers of tattoos, as members of the counterculture, began to display their body art as a sign of resistance to the values of the white, heterosexual middle class. The clientele changed from sailors, bikers, and gang members to the middle and upper classes. There was also a shift in iconography, from badge-like images based on repetitive pre-made designs known as flash to customized full-body work influenced by Polynesian and Japanese tattoo art, known as sleeves, which fall under the relatively new and popular avant-garde genre. Tattooers transformed into "tattoo artists": men and women with fine art backgrounds began to enter the profession alongside the older, traditional tattooists.
Tattoos have experienced a resurgence in popularity in many parts of the world, particularly in Europe, Japan, and North and South America. The growth in tattoo culture has seen an influx of new artists into the industry, many of whom have technical and fine arts training. Coupled with advancements in tattoo pigments and the ongoing refinement of the equipment used for tattooing, this has led to an improvement in the quality of tattoos being produced. 
Star Stowe (Miss February 1977) was the first Playboy Playmate with a visible tattoo on her centerfold.
During the 2000s, the presence of tattoos became evident within pop culture, inspiring television shows such as A&E's Inked and TLC's Miami Ink and LA Ink. In addition, many celebrities have made tattoos more acceptable in recent years.
Contemporary art exhibitions and visual art institutions have featured tattoos as art through such means as displaying tattoo flash, examining the works of tattoo artists, or otherwise incorporating examples of body art into mainstream exhibits. One such 2009 Chicago exhibition, Freaks & Flash, featured both examples of historic body art as well as the tattoo artists who produced it. 
In 2010, 25% of Australians under age 30 had tattoos.  Mattel released a tattooed Barbie doll in 2011, which was widely accepted, although it did attract some controversy. 
Author and sociology professor Beverly Yuen Thompson wrote Covered in Ink: Tattoos, Women, and the Politics of the Body (published in 2015, with research conducted between 2007 and 2010) on the history of tattooing and how it has been normalized around specific gender roles in the USA. She also released a documentary, Covered, featuring interviews with heavily tattooed women and female tattoo artists in the US. From the distinct history of tattooing, its historical origins, and its transfer to American culture come transgressive styles prescribed for tattooed men and women. These "norms", written into the social rules of tattooing, imply what is considered the correct way for each gender to be tattooed. Men in tattoo communities are expected to be "heavily tattooed": many tattoos covering multiple parts of the body and expressing aggressive or masculine images, such as skulls, zombies, or dragons. Women, by contrast, are expected to be "lightly tattooed": only a small number of tattoos, placed in areas of the body that are easy to cover, with images expected to be more feminine or cute (e.g., fairies, flowers, hearts). When women step outside the "lightly tattooed" concept by choosing tattoos of a masculine design, or on parts of the body that are not easy to cover (forearms, legs), they commonly face discrimination from the public. Heavily tattooed women report being stared at in public, being denied certain employment opportunities, facing judgement from family members, and even receiving sexist or homophobic slurs from strangers.
Over the past three decades, Western tattooing has become a practice that has crossed social boundaries from "low" to "high" class while reshaping power dynamics regarding gender. It has roots in the "exotic" tribal practices of Native Americans and the Japanese, influences still visible in the present day.
As various social movements progressed, bodily inscription crossed class boundaries and became common among the general public. In particular, the tattoo is one access point for a revolutionary aesthetics of women, and feminist theory has much to say on the subject. Bodies of Subversion: A Secret History of Women and Tattoo, by Margot Mifflin, became the first history of women's tattoo art when it was released in 1997. In it, she documents women's involvement in tattooing coinciding with feminist successes, with surges in the 1880s, 1920s and 1970s. Today, women sometimes use tattoos as a form of bodily reclamation after traumatic experiences such as abuse or breast cancer. In 2012, tattooed women outnumbered tattooed men for the first time in American history: according to a Harris poll, 23% of women in America had tattoos that year, compared to 19% of men. In 2013, Miss Kansas, Theresa Vail, became the first Miss America contestant to show tattoos during the swimsuit competition: the insignia of the U.S. Army Dental Corps on her left shoulder and the "Serenity Prayer" along the right side of her torso.
The legal status of tattoos is still developing. In recent years, various lawsuits have arisen in the United States regarding the status of tattoos as a copyrightable art form. However, these cases have either been settled out of court or are currently being disputed, and therefore no legal precedent exists directly on point.  The process of tattooing was held to be a purely expressive activity protected by the First Amendment by the Ninth Circuit in 2010. 
Tattoos are valuable identification marks because they tend to be permanent. They can be removed, but they do not fade, although the color may change with exposure to the sun, and they have recently been very useful in identifying people. In today's industrialized cultures, tattoos and piercings are a popular art form shared by people of all ages. They can also indicate defiance, independence, and belonging, as for example in prison or gang cultures. Tattoos may also harm the skin and lead to skin-care issues.
Throughout the world's different military branches, tattoos are either regulated under policies or strictly prohibited to fit dress code rules.
United States
United States Air Force
The United States Air Force regulates all kinds of body modification. Any tattoos deemed "prejudicial to good order and discipline", or "of a nature that may bring discredit upon the Air Force", are prohibited. Specifically, any tattoo that may be construed as "obscene or advocate sexual, racial, ethnic or religious discrimination" is disallowed. Tattoo removal may not be enough to qualify: resultant "excessive scarring" may itself be disqualifying. Further, Air Force members may not have tattoos on their neck, face, head, tongue, lips or scalp.
United States Army
The United States Army regulates tattoos under AR 670–1, last updated in 2015. Soldiers are permitted to have tattoos as long as they are not on the neck, hands, or face, with exceptions for one ring tattoo on each hand and for permanent makeup. Additionally, tattoos deemed to be sexist, racist, derogatory, or extremist remain banned.
United States Coast Guard
The United States Coast Guard policy has changed over the years. Tattoos must not be visible above the collarbone or when wearing a V-neck shirt, and tattoos or military brands on the arms must not extend past the wrist. The only hand tattoo permitted is a single ring-shaped tattoo no more than 1/4 inch wide. Facial tattoos are permitted only as permanent eyeliner for women, provided it is appropriately worn and not brightly colored, in keeping with the uniform dress code. Disrespectful, derogatory, or sexually explicit tattoos are prohibited anywhere on the body.
United States Marines
The United States Marine Corps disclosed a new tattoo policy, meeting its standards of professionalism in military appearance, in Marine Corps Bulletin 1020, released on 6 February 2016, superseding all previous policies.
The new Marine Corps policy prohibits tattoos on certain parts of the body, such as the wrist, knee, elbow, and above the collarbone. Tattoos must end two inches above the wrist; around the elbow, two inches above and one inch below; and around the knee, two inches above and two inches below.
United States Navy
The United States Navy has changed its policies and become more lenient when it comes to tattoos. For the first time, the Navy is allowing sailors to have neck tattoos, as long as they are no larger than one inch. Sailors are also allowed to have any number of tattoos of any size on the arms and legs, as long as they are not deemed offensive.
The Indian Army tattoo policy has been in place since 11 May 2015. The government declared that enlistees who belong to tribal communities are allowed to have tattoos anywhere on the body. Indians who are not part of a tribal community may only have tattoos on designated parts of the body, such as the forearm, elbow, wrist, the side of the palm, and the back and front of the hands. Offensive, sexist, and racist tattoos are not allowed.
Do All Languages Derive from a Single Common Ancestor?
The Tower of Babel story is a fanciful attempt to account for a very real question: What was the first language and why are there now so many of them?
The video below from TED Ed shows a brief history of how languages evolve, as speakers of the same language lose contact with each other in the centuries after migration and gradually drift linguistically in different directions.
What’s most interesting is not simply how we got multiple languages but rather how we determine, without the benefit of a time machine, which modern languages are related. To do this, historical linguists compare large numbers of words in different languages, looking for similarities that can’t be explained by other factors, such as onomatopoeia (the word for cat is something like “miao” in several languages, but, well, there’s likely an obvious reason for that) or borrowing (the word for tea in most languages is something like te or cha, but those can both be traced back to trade routes from different parts of China).
World Atlas of Linguistic Structures, Feature 138A: Tea by Östen Dahl
Similarities that are solid evidence of common ancestry may at first not look like similarities at all. For example, compare the English words father, foot, far, and five with the Ancient Greek words meaning the same thing: pater, podos, per (technically “forward”), and pente. Notice anything? The English terms all begin with an “f” sound while the Ancient Greek ones start with a “p” sound. When you piece together a whole series of systematic parallels like this across several languages (we can add in Latin pedis and German Fuß, both meaning “foot,” for example), you can begin figuring out what the common ancestor, known as a proto-language, might have looked like.
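This kind of systematic check is mechanical enough to sketch in a few lines of code. The snippet below is only a toy illustration of the comparative method, using the f/p correspondence and the four word pairs from the text (the function name and data structure are mine, not a real linguistics library):

```python
# Toy comparative method: test whether a proposed sound correspondence
# (English word-initial "f" <-> Ancient Greek word-initial "p") holds
# systematically across a small cognate list.

cognates = [
    # (English, Ancient Greek, shared meaning)
    ("father", "pater", "father"),
    ("foot",   "podos", "foot"),
    ("far",    "per",   "forward/far"),
    ("five",   "pente", "five"),
]

def correspondence_holds(pairs, eng_sound, grk_sound):
    """True if every English word begins with eng_sound while its
    Greek counterpart begins with grk_sound."""
    return all(e.startswith(eng_sound) and g.startswith(grk_sound)
               for e, g, _ in pairs)

print(correspondence_holds(cognates, "f", "p"))  # True: a systematic f/p match
print(correspondence_holds(cognates, "b", "p"))  # False: no such pattern
```

Real historical linguistics works over far larger word lists and whole series of correspondences at once, but the core move is the same: a correspondence counts as evidence only when it recurs systematically, not in a single lucky pair.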
The common ancestor of English, Latin, Greek, Russian, Gaelic, Hindi, and many other languages spoken in Europe and India is known as Proto-Indo-European, whereas the more recent common ancestor of just English, German, Dutch, Norwegian and the other Germanic languages is known as Proto-Germanic. The video below describes more of these systematic sound changes between Proto-Germanic and the rest of the Indo-European languages, and how they were discovered by linguists including the Brothers Grimm (yes, those Brothers Grimm). More in the video below.
We can do pretty well going step-by-step with this base-level comparison of languages—whether modern or those for which we only have written records—which has enabled linguists to reconstruct around 50 proto-languages to varying levels of detail. But the real time-machine problem kicks in when we attempt to go even further back, to what the common ancestor of these proto-languages might have been. Since there aren’t any modern human societies that are incapable of language, and any baby can learn any language, it’s not unreasonable to suppose that we were probably using language when the first genetically modern humans began to spread throughout, and out of, Africa. But unlike cooking utensils or hunting weapons, languages don’t leave physical artifacts of being spoken, and writing of any kind wasn’t invented until somewhere between 50,000 and 300,000 years later. Ish.
And unfortunately, this means that any theory of the first human language must be based on pretty darn flimsy evidence. This problem was recognized as early as 1866, when the Linguistic Society of Paris prohibited further papers on the topic, and although this ban is no longer heeded, there’s still nothing like consensus on where language came from or what the earliest ones might have sounded like.
But one tantalizing piece of evidence comes from a curious source: the newest languages of the world, like Nicaraguan Sign Language and other creoles, which arise when a group of children make order out of inconsistent linguistic input. We may never know for sure, but perhaps the process of creating a new language from scratch hasn’t changed that much across the millennia.
In the late Pleistocene Epoch (from about 126,000 to 11,700 years ago), the Scandinavian ice sheet covered the northern half of the Netherlands. After this period, a large area in the north of what is now the Netherlands was left covered by moraine (glacial accumulation of earth and rock debris). In the centre and south, the Rhine and Maas rivers unloaded thick layers of silt and gravel transported from the European mountain chains. Later, during the Holocene Epoch (i.e., the past 11,700 years), clay was deposited in the sheltered lagoons behind the coastal dunes, and peat soil often subsequently developed in these areas. If the peat soil was washed away by the sea or dug away by humans (for the production of fuel and salt), lakes were created. Many of these were reclaimed in later centuries (as mentioned above), while others now form highly valued outdoor recreational areas.
Before Babel? Ancient Mother Tongue Reconstructed
The ancestors of people from across Europe and Asia may have spoken a common language about 15,000 years ago, new research suggests.
Now, researchers have reconstructed words, such as "mother," "to pull" and "man," which would have been spoken by ancient hunter-gatherers, possibly in an area such as the Caucasus. The word list, detailed today (May 6) in the journal Proceedings of the National Academy of Sciences, could help researchers retrace the history of ancient migrations and contacts between prehistoric cultures.
"We can trace echoes of language back 15,000 years to a time that corresponds to about the end of the last ice age," said study co-author Mark Pagel, an evolutionary biologist at the University of Reading in the United Kingdom.
Tower of Babel
The idea of a universal human language goes back at least to the Bible, in which humanity spoke a common tongue, but were punished with mutual unintelligibility after trying to build the Tower of Babel all the way to heaven. [Image Gallery: Ancient Middle-Eastern Texts]
But not all linguists believe in a single common origin of language, and trying to reconstruct that language seemed impossible. Most researchers thought they could only trace a language's roots back 3,000 to 4,000 years. (Even so, researchers recently said they had traced the roots of a common mother tongue of many Eurasian languages back 8,000 to 9,500 years, to Anatolia, a southwestern Asian peninsula that is now part of Turkey.)
Pagel, however, wondered whether language evolution proceeds much like biological evolution. If so, the most critical words, such as the frequently used words that define our social relationships, would change much more slowly.
To find out if he could uncover those ancient words, Pagel and his colleagues in a previous study tracked how quickly words changed in modern languages. They identified the most stable words. They also mapped out how different modern languages were related.
They then reconstructed ancient words based on the frequency at which certain sounds tend to change in different languages; for instance, p's often become f's over time in many languages, which is why Latin "pater" corresponds to English "father" (both descend from the same ancestral root).
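As a toy illustration of this kind of regular sound correspondence (my own sketch, not the study's actual method), a Grimm's-law-style consonant shift can be applied mechanically to Latin roots, which preserve the older Indo-European stops, to produce forms close to their English cognates:

```python
# Toy sketch of a regular sound correspondence (Grimm's law): in the
# Germanic branch, the old Indo-European stops p, t, k shifted to the
# fricatives f, th, h. Latin kept the older stops, so applying the shift
# to a Latin root approximates its English cognate.
GRIMM_SHIFT = {"p": "f", "t": "th", "c": "h", "k": "h"}  # Latin c = /k/

def apply_shift(root: str) -> str:
    """Apply the word-initial consonant shift to a Latin root."""
    first, rest = root[0], root[1:]
    return GRIMM_SHIFT.get(first, first) + rest

# Latin roots alongside their mechanically shifted forms:
for latin, english in [("pater", "father"), ("tres", "three"), ("cornu", "horn")]:
    print(f"{latin} -> {apply_shift(latin)} (cf. English '{english}')")
```

Real comparative reconstruction works over many such correspondence sets at once and conditions on the phonetic environment; this sketch only shows why regular correspondences make cognates recognizable across languages.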
The researchers could predict what 23 words, including "I," "ye," "mother," "male," "fire," "hand" and "to hear" might sound like in an ancestral language dating to 15,000 years ago.
In other words, if modern-day humans could somehow encounter their Stone Age ancestors, they could say one or two very simple statements and make themselves understood, Pagel said.
Limitations of tracing language
Unfortunately, this language technique may have reached its limits in terms of how far back in history it can go.
"It's going to be very difficult to go much beyond that, even these slowly evolving words are starting to run out of steam," Pagel told LiveScience.
The study raises the possibility that researchers could combine linguistic data with archaeology and anthropology "to tell the story of human prehistory," for instance by recreating ancient migrations and contacts between people, said William Croft, a comparative linguist at the University of New Mexico, who was not involved in the study.
"That has been held back because most linguists say you can only go so far back in time," Croft said. "So this is an intriguing suggestion that you can go further back in time."
Obstacles to Successful English-Language Policies
To be sure, one-language policies can have repercussions that decrease efficiency. Evidence from my research at Rakuten—along with a study I conducted with Pamela Hinds of Stanford University and Catherine Cramton of George Mason University at a company I’ll call GlobalTech and a study I conducted at a firm I’ll call FrenchCo—reveals costs that global English-language rules can create. Proper rollout mitigates the risks, but even well-considered plans can encounter pitfalls. Here are some of the most common.
Change always comes as a shock.
No amount of warning and preparation can entirely prevent the psychological blow to employees when proposed change becomes reality. When Marie (all names in this article are disguised, with the exception of Mikitani and Ito) first learned of FrenchCo’s English-only policy, she was excited. She had been communicating in English with non-French partners for some time, and she saw the proposed policy as a positive sign that the company was becoming more international. That is, until she attended a routine meeting that was normally held in French. “I didn’t realize that the very first meeting after the rule came out was really going to be in English. It was a shock,” Marie says. She recalls walking into the meeting with a lot of energy—until she noticed the translator headsets.
“They’re humiliating,” she says. “I felt like an observer rather than a participant at my own company.”
Will Mandarin Be Next?
Given the size and growth of the Chinese economy, why move to an English-only policy? Isn’t it possible that Mandarin could overtake English as the global language of business? It’s possible, but unlikely. There are two reasons for this.
First, English has a giant head start. China can’t replicate Britain’s colonial history. The British Empire began embedding the English language in many parts of the world as early as the 16th century. Philanthropic work by American and British organizations further spread English, long before corporations began to adopt it at the workplace.
Second, for much of the world, Mandarin is extremely difficult to learn. It’s easier to pick up “broken English” than “broken Mandarin.” Knowing Mandarin—or any language spoken by huge numbers of people—is an advantage, clearly. But for now, Mandarin is not a realistic option for a one-language policy.
Compliance is spotty.
An English mandate created a different problem for a service representative at GlobalTech. Based in Germany, the technology firm had subsidiaries worldwide. Hans, a service representative, received a frantic call from his boss when a key customer’s multimillion-dollar financial services operation ground to a halt as a result of a software glitch. Hundreds of thousands of dollars were at stake for both the customer and GlobalTech. Hans quickly placed a call to the technical department in India, but the software team was unable to jump on the problem because all communications about it were in German—despite the English-only policy instituted two years earlier requiring that all internal communications (meetings, e-mails, documents, and phone calls) be carried out in English. As Hans waited for documents to be translated, the crisis continued to escalate. Two years into the implementation, adoption was dragging.
When nonnative speakers are forced to communicate in English, they can feel that their worth to the company has been diminished, regardless of their fluency level. “The most difficult thing is to have to admit that one’s value as an English speaker overshadows one’s real value,” a FrenchCo employee says. “For the past 30 years the company did not ask us to develop our foreign-language skills or offer us the opportunity to do so,” he points out. “Now, it is difficult to accept the fact that we are disqualified.” Employees facing one-language policies often worry that the best jobs will be offered only to those with strong English skills, regardless of content expertise.
When my colleagues and I interviewed 164 employees at GlobalTech two years after the company’s English-only policy had been implemented, we found that nearly 70% of employees continued to experience frustration with it. At FrenchCo, 56% of medium-fluency English speakers and 42% of low-fluency speakers reported worrying about job advancement because of their relatively limited English skills. Such feelings are common when companies merely announce the new policy and offer language classes rather than implement the shift in a systematic way. It’s worth noting that employees often underestimate their own abilities or overestimate the challenge of developing sufficient fluency. (See the sidebar “Gauging Fluency.”)
Progressing from beginner level to advanced—which greatly improves an employee’s ability to communicate—involves mastering around 3,500 words. That’s a far less daunting task than adding the 10,000 words necessary to move from advanced to native speaker, for which the payoff may be lower.
Job security falters.
Even though achieving sufficient fluency is possible for most, the reality is that with adoption of an English-only policy, employees’ job requirements change—sometimes overnight. That can be a bitter pill to swallow, especially among top performers. Rakuten’s Mikitani didn’t mince words with his employees: He was clear that he would demote people who didn’t develop their English proficiency.
It’s not unusual to hear nonnative speakers revert to their own language at the expense of their English-speaking colleagues, often because it’s faster and easier to conduct meetings in their mother tongue. Others may take more aggressive measures to avoid speaking English, such as holding meetings at inopportune times. Employees in Asia might schedule a global meeting that falls during the middle of the night in England, for instance. In doing so, nonnative speakers shift their anxiety and loss of power to native speakers.
Many FrenchCo employees said that when they felt that their relatively poor language skills could become conspicuous and have career-related consequences, they simply stopped contributing to common discourse. “They’re afraid to make mistakes,” an HR manager at the firm explains, “so they will just not speak at all.”
In other cases, documents that are supposed to be composed in English may be written in the mother tongue—as experienced by Hans at GlobalTech—or not written at all. “It’s too hard to write in English, so I don’t do it!” one GlobalTech employee notes. “And then there’s no documentation at all.”
The bottom line takes a hit when employees stop participating in group settings. Once participation ebbs, processes fall apart. Companies miss out on new ideas that might have been generated in meetings. People don’t report costly errors or offer observations about mistakes or questionable decisions. One of the engineers at GlobalTech’s Indian office explained that when meetings reverted to German, his ability to contribute was cut off. He lost important information—particularly in side exchanges—despite receiving meeting notes afterward. Often those quick asides contained important contextual information, background analyses, or hypotheses about the root cause of a particular problem. He neither participated in the meetings nor learned from the problem-solving discussions.
Chapter 1.1: General Introduction
Africa. It’s a term that creates a neat box, limited by geographic boundaries. But what attributes does this continent’s name conjure? Your personal experience and exposure define your mental pictures of Africa. If you were raised there, your image is unlikely to be continent-wide. You picture the mango tree outside your house, your father and his friends sitting on mats and talking. Your mind is filled with images of city traffic and the cries of hawkers selling soft drinks, newspapers, tissues, plantain chips, peanuts. You’re transported to a boarding school as you press your uniform, hurrying to line up before the prefect discovers you’re late. If you aren’t African, and you’ve never travelled there, these particular scenes are unlikely to fill your mind. Your thoughts–positive or negative–are shaped by media, your education, and your imagination.
Despite American attempts to make K-12 education broader through multiculturalism, most elementary and high school teachers have studied little about Africa, and are as subject as their students to visual and cultural stereotypes. Why is it so easy to stereotype a whole continent? Partly because our sense of geography is weak. No American, even one who has never travelled, would assume that Icelandic and Greek cultures are identical or even substantially similar. Yet unfamiliarity with African countries, ethnonyms, histories, and cultural distinctions often lumps anything from the continent under an adjective no more specific than “African.”
A quick look at a photo taken from space demonstrates just how small Europe is in comparison to Africa (Fig. 1). Print maps have distorted the size relationships of land masses for centuries in order to conveniently show longitude and latitude. A comparative map further indicates just how vast Africa is (Fig. 2). If we can recognize just how different Iceland and Greece are, then why are we so eager to believe that African cultures are similar, or to assume that the continent shares a common religion, history, or arts?
Fig. 2. Kai Krause’s “The True Size of Africa.”
Media imagery has created a picture of Africa that is often out-of-date, exaggerated, or that magnifies issues of one area as if they apply to the continent. Often perspectives deny complexity or are ahistorical, as if Africa has remained unchanged for centuries or longer.
Exercises

Images, l. to r.: Starving girl, late 1960s, Dr. Lyle Conrad, public domain; soldiers, Central African Rep., 2007, Martin H, CC BY-SA 2.0; house, public domain; smiling girl, public domain; Jasper Beckx, portrait of Miguel de Castro, Nationalmuseet, Copenhagen, 1643, Creative Commons 0; Bishop Samuel Ajayi Crowther and Son, 1870, public domain; Luanda, Angola, 2013, Fabio Vanin, CC BY-SA 3.0; refugee camp, Horn of Africa, 2011, Oxfam, CC 3.0; Models, Uganda, 2014, Eguanokla, CC BY-SA 4.0; Our Lady of Peace, Yamoussoukro, Cote d’Ivoire, 2013, jbdodane, CC BY-NC 2.0; Maroko, Nigeria, 2010, Heinrich Boll-Stiftung, CC BY-SA 2.0; giraffe, 2005, Miroslav Duchacek, CC BY-SA 3.0; Maasai warriors, 1921, public domain; Nkrumah mausoleum, Accra, Ghana, public domain.
Many parts of Africa were in direct or indirect contact with Europe and Asia. Egypt and some other parts of North Africa were incorporated into the Roman Empire and afterward continued to trade with the Mediterranean world. By the 8th century, Arabic-speaking chroniclers recorded information about parts of eastern, northern, and western Africa. Ethiopians travelled to Byzantium and the Middle East, as well as India, and Persians and Arabs traded with a number of East African coastal communities, as did the Chinese (Fig. 3). The 15th century saw the beginning of direct European contact with West, then Central, then South and East Africa, as well as travelers’ accounts and documents written by Africans in European languages or Arabic.
Fig. 3. The Da-ming-hun-yi-tu, or Composite Map of the Ming Empire, is the oldest surviving map that shows Africa. Although depicted from an ethnocentric viewpoint (China dwarfs every other known land mass), it demonstrates early Chinese awareness of the continent and even (albeit enlarged) one of its interior lakes, possibly Lake Victoria. Painted on silk, the map is huge, at approximately 12.67 x 6.67 feet. It appears to be a copy of a map from 1389. Wikimedia Commons. Public domain.
Because Europeans were confined primarily to coastal regions, their information about the West and Central African interiors was usually second-hand and often inaccurate. Until the 19th century, restricted access prevented them from reaching the hinterland where they could obtain the raw materials they sought: gold, ivory, furs, pepper, and human beings. Coastal merchants, who acted as middlemen, profited from their control of trade. Europeans could not wrest it from them because they were few in number, arrived on floating targets, and were equipped with volatile gunpowder and inaccurate firearms whose reloading time was no match for a well-aimed arrow.
In the 19th century, however, all that changed. The repeating rifle shifted the military advantage, and, as the century wore on, the mounted Maxim machine gun provided even more effective firepower. Two additional shifts earlier in the century provided the military with advance intelligence: missionary and commercial penetration of the interior. Missionaries forged diplomatic alliances, took note of local power structures, and learned new languages. Commercial concerns such as the Royal Niger Company did the same.
While some European powers had gained an earlier foothold in Africa–the Portuguese in Angola and Mozambique, the Dutch at the Cape of Good Hope, the French at St. Louis along the Senegal River, and the turnover of Portuguese-Dutch-Danish-English occupants of coastal Ghanaian forts–the late 19th century produced European determination to carve up the continent into defined spheres of influence. The Berlin Conference of 1884-85 established European borders for French, English, Belgian, German, Portuguese, Italian, and Spanish interests that soon became colonies (Fig. 4).
Fig. 4. Map of colonial Africa as in 1913, with modern borders. Eric Gaba, Creative Commons CC BY-SA 3.0
Colonialism did not actually last long. Most African nations became independent by 1960/61. In some respects its impact was negligible; in others, it had major political and cultural effects. When Europeans took control, they found a continent with varied political systems. Some areas were empires or kingdoms, run by a single ruler and his counselors. Other areas were more egalitarian city-states, run by all adult men or by a gerontocracy. Some ethnic groups operated as single polities, while others comprised multiple states that warred with one another. The arbitrary nature of the Berlin Conference’s borders meant that old states or families might be split into two spheres. It also meant that former rulers might continue as cultural leaders if cooperative, or be dethroned or exiled if resistant. Even those who kept their positions no longer had military or full legal authority, nor did they have the ability to collect taxes. Governments based on the home country’s will were established, and independence did not reinstate traditional rulers to the full powers they had held previously.
Besides new political and court systems, foreign religious and educational systems had major lasting influences. Christianity arrived in Egypt, the Sudan, and Ethiopia in the 4th century, the same period it was recognized officially in Europe, and parts of northern and eastern Africa became Muslim immediately after the Prophet Muhammed’s death. Islam continued to spread into West Africa slowly via North African trade, but Christian missionization exploded in the 19th century, and since the 1970s both faiths have pushed many older religions aside. Advancement in the civil service–whether colonial or independence era–requires mastery of a foreign language that is usually the “language of instruction” in schools. Curricula are based on European models and extend to university level, which means they vary considerably depending on the former colonial power.
Access to international media and more accessible travel or migration have had their own impact. Foreign films, music videos, and clothing jostle with local products. While none of these features means that African culture has been abandoned, it does signify that values have been adjusted, and cultures often compete for supremacy, some winning because of their status as imported novelties. As we’ll see, the visual arts are part of this duality, with retentions of older practices coexisting with new materials, functions, training, and patrons.
There are many different ways of breaking this huge continent into smaller segments for effective discussion. We could look at climate zones: desert, Sahel, savannah, rainforest. We could consider colonial history and examine Anglophone, Francophone, or Lusophone nations. We’re going to take an approach that considers a limited number of geographic zones, dividing the continent into seven sectors. These often include areas that were once part of a large kingdom or kingdoms, or had linked trading patterns, or share certain cultural, linguistic, or historic features–but they are somewhat arbitrary just the same. They are as follows: North Africa, Western Sudan, Upper Guinea Coast, Lower Guinea Coast, Central Africa, Southern Africa, East Africa (Figs. 5-11).
Fig. 5. North Africa: Egypt, Libya, Tunisia, Algeria, Morocco, Western Sahara. These countries have had long relationships with Europe, the Middle East, and countries south of the Saharan desert. The early spread of Islam limited the figurative arts, which Islamic tradition discourages.
Newer techniques cast doubt on Botai
As the 2020s begin, the pace of technological innovation in archaeology continues to accelerate. And new archaeological data have begun to trickle in from understudied areas.
With improving methods, new information has triggered serious doubts about the Botai/Indo-European model of horse domestication.
In a shocking 2018 study, a French research team revealed that the horses of Botai were in fact not the domestic horse (Equus caballus) at all, but instead Equus przewalskii – the Przewalski’s horse, a wild animal with no documented evidence of management by human societies.
A family of wild Przewalski’s horses at sunset in Khustai National Park, Mongolia, where they have been reintroduced following their near-extinction. William Taylor , CC BY-ND
Another project using ancient DNA analysis of human remains from Botai showed no genetic links between the area’s ancient residents and Indo-European groups, undermining the idea that horse domestication at Botai stimulated a continental dispersal on horseback.
In the ensuing chaos, researchers must now find a way to piece together the horse’s story, and find an explanation that fits these new facts.
Some, including the equine DNA researchers who published the new discoveries, now suggest that Botai represents a separate, failed domestication event of Przewalski’s horse.
Other scholars now seek to reevaluate the archaeological and historical records around the horse’s initial domestication with a more skeptical eye.
As of the writing of this story, the oldest clearly identified remains of the modern domestic horse, Equus caballus, date back only as far as about 2000 B.C. – to the chariot burials of Russia and Central Asia. From here, researchers are scrambling backwards in time, seeking to find the “big bang” of the human-horse relationship.
Pastoral herding is still a key way of life in Mongolia, and horses are important as both livestock and transportation. Orsoo Bayarsaikhan Photography, CC BY-ND
Africa: Human Geography
Africa is sometimes nicknamed the "Mother Continent" as it's the oldest inhabited continent on Earth.
Africa, the second-largest continent, is bounded by the Mediterranean Sea, the Red Sea, the Indian Ocean, and the Atlantic Ocean. It is divided in half almost equally by the Equator. The continent includes the islands of Cape Verde, Madagascar, Mauritius, Seychelles, and Comoros.
The origin of the name “Africa” is greatly disputed by scholars. Most believe it stems from words used by the Phoenicians, Greeks, and Romans. Important words include the Egyptian word Afru-ika, meaning “Motherland”; the Greek word aphrike, meaning “without cold”; and the Latin word aprica, meaning “sunny.”
Today, Africa is home to more countries than any other continent in the world. These countries are: Morocco, Western Sahara (Morocco), Algeria, Tunisia, Libya, Egypt, Sudan, Chad, Niger, Mali, Mauritania, Senegal, The Gambia, Guinea-Bissau, Guinea, Sierra Leone, Liberia, Côte d’Ivoire, Ghana, Burkina Faso, Togo, Benin, Nigeria, Cameroon, Central African Republic, Equatorial Guinea, Gabon, Congo, Democratic Republic of the Congo, Angola, Namibia, Botswana, South Africa, Lesotho, Swaziland, Mozambique, Zimbabwe, Zambia, Malawi, Tanzania, Rwanda, Burundi, Uganda, Kenya, Somalia, Ethiopia, Djibouti, Eritrea, and the island countries of Cape Verde, Madagascar, Mauritius, Seychelles, and Comoros.
The African continent has a unique place in human history. Widely believed to be the “cradle of humankind,” Africa is the only continent with fossil evidence of human beings (Homo sapiens) and their ancestors through each key stage of their evolution. These include the Australopithecines, our earliest ancestors; Homo habilis, our tool-making ancestors; and Homo erectus, a more robust and advanced relative of Homo habilis that was able to walk upright.
These ancestors were the first to develop stone tools, to move out of trees and walk upright, and, most importantly, to explore and migrate. While fossils of Australopithecines and Homo habilis have only been found in Africa, examples of Homo erectus have been found in the Far East, and their tools have been excavated throughout Asia and Europe. This evidence supports the idea that the species of Homo erectus that originated in Africa was the first to successfully migrate and populate the rest of the world.
This human movement, or migration, plays a key role in the cultural landscape of Africa. Geographers are especially interested in migration as it relates to the way goods, services, social and cultural practices, and knowledge are spread throughout the world.
Two other migration patterns, the Bantu Migration and the African slave trade, help define the cultural geography of the continent.
The Bantu Migration was a massive migration of people across Africa about 2,000 years ago. The Bantu Migration is the most important human migration to have occurred since the first human ancestors left Africa more than a million years ago. Lasting for 1,500 years, the Bantu Migration involved the movement of people whose language belonged to the Niger-Congo language group. The common Niger-Congo word for human being is bantu.
The Bantu Migration was a southeastern movement. Historians do not agree on why Bantu-speaking people moved away from their homes in West Africa’s Niger Delta Basin. They first moved southeast, through the rain forests of Central Africa. Eventually, they migrated to the savannas of the southeastern and southwestern parts of the continent, including what is today Angola and Zambia.
The Bantu Migration had an enormous impact on Africa’s economic, cultural, and political practices. Bantu migrants introduced many new skills into the communities they interacted with, including sophisticated farming and industry. These skills included growing crops and forging tools and weapons from metal.
These skills allowed Africans to cultivate new areas of land that had a wide variety of physical and climatic features. Many hunter-gatherer communities were assimilated, or adopted, into the more technologically advanced Bantu culture. In turn, Bantu people adopted skills from the communities they encountered, including animal husbandry, or raising animals for food.
This exchange of skills and ideas greatly advanced Africa’s cultural landscape, especially in the eastern, central, and southern regions of the continent. Today, most of the population living in these regions is descended from Bantu migrants or from mixed Bantu-indigenous origins.
The third massive human migration in Africa was the African slave trade. Between the 15th and 19th centuries, more than 15 million Africans were transported across the Atlantic Ocean to be sold as slaves in North and South America. Millions of slaves were also transported within the continent, usually from Central Africa and Madagascar to North Africa and the European colony of South Africa.
Millions of Africans died in the slave trade. Most slaves were taken from the isolated interior of the continent. They were sold in the urban areas on the West African coast. Thousands died in the brutal process of their capture, and thousands more died on the forced migration to trading centers. Even more lost their lives on the treacherous voyage across the Atlantic Ocean.
The impacts of slavery on Africa are widespread and diverse. Computerized calculations have projected that if there had been no slave trade, the population of Africa would have been 50 million instead of 25 million in 1850. Evidence also suggests that the slave trade contributed to the long-term colonization and exploitation of Africa. Communities and infrastructure were so damaged by the slave trade that they could not be rebuilt and strengthened before the arrival of European colonizers in the 19th century.
While Africans suffered greatly during the slave trade, their influence on the rest of the world expanded. Slave populations in North and South America made tremendous economic, political, and cultural contributions to the societies that enslaved them. The standard of living in North and South America, built on agriculture, industry, communication, and transportation, would be much lower if it weren’t for the hard, forced labor of African slaves. Furthermore, many of the Western Hemisphere’s cultural practices, especially in music, food, and religion, are a hybrid of African and local customs.
Contemporary Africa is incredibly diverse, incorporating hundreds of native languages and indigenous groups. The majority of these groups blend traditional customs and beliefs with modern societal practices and conveniences. Three groups that demonstrate this are the Maasai, Tuareg, and Bambuti.
Maasai peoples are the original settlers of southern Kenya and northern Tanzania. The Maasai are nomadic pastoralists. Nomadic pastoralists are people who continually move in order to find fresh grasslands or pastures for their livestock. The Maasai migrate throughout East Africa and survive off the meat, blood, and milk of their cattle.
The Maasai are famous for their striking red robes and rich traditional culture. Young Maasai men between the ages of 15 and 30 are known as moran, or “warriors.” Moran live in isolation in unpopulated wilderness areas, called “the bush.” During their time as moran, young Maasai men learn tribal customs and develop strength, courage, and endurance.
Even though some remain nomadic, many Maasai have begun to integrate themselves into the societies of Kenya and Tanzania. Modern ranching and wheat cultivation are becoming common. Maasai also support more tribal control of water resources. Women are pressuring the tribe for greater civil rights, as Maasai society is one of the most male-dominated in the world.
The Tuareg are a pastoralist society in North and West Africa. The harsh climate of the Sahara and the Sahel has influenced Tuareg culture for centuries.
Traditional Tuareg clothing serves historical and environmental purposes. Head wraps called cheches protect the Tuareg from the Saharan sun and help conserve body fluids by limiting sweat. Tuareg men also cover their face with the cheche as a formality when meeting someone for the first time. Conversation can only become informal when the more powerful man uncovers his mouth and chin.
Light, sturdy gowns called bubus allow for cool airflow while deflecting heat and sand. Tuaregs are often called the "blue men of the Sahara" for the blue-colored bubus they wear in the presence of women, strangers, and in-laws.
The Tuareg have updated these traditional garments, bringing in modern color combinations and pairing them with custom sandals and silver jewelry they make by hand. These updated styles are perhaps best seen during the annual Festival in the Desert. This three-day event, held in the middle of the Sahara, includes singing competitions, concerts, camel races, and beauty contests. The festival has rapidly expanded from a local event to an international destination supported by tourism.
The Bambuti is a collective name for four populations native to Central Africa: the Sua, Aka, Efe, and Mbuti. The Bambuti live primarily in the Congo Basin and Ituri Forest. Sometimes, these groups are called "pygmies," although the term is often considered offensive. Pygmy is a term used to describe various ethnic groups whose average height is unusually low, below 1.5 meters (5 feet).
The Bambuti are believed to have one of the oldest existing bloodlines in the world. Ancient Egyptian records show that the Bambuti have been living in the same area for 4,500 years. Geneticists are interested in the Bambuti for this reason. Many researchers conclude that their ancestors were likely among the first modern human populations to migrate out of Africa.
Bambuti groups are spearheading human rights campaigns aimed at increasing their participation in local and international politics. The Mbuti, for instance, are pressuring the government to include them in the peace process of the Democratic Republic of the Congo. Mbuti leaders argue that their people were killed, forced into slavery, and even eaten during the Congo Civil War, which officially ended in 2003. Mbuti leaders have appeared at the United Nations to gather and present testimony on human rights abuses during and after the war. Their efforts led to the presence of U.N. peacekeeping forces in the Ituri Forest.
Africa's history and development have been shaped by its political geography. Political geography is the internal and external relationships between various governments, citizens, and territories.
The great kingdoms of West Africa developed between the 9th and 16th centuries. The Kingdom of Ghana (Ghana Empire) became a powerful empire through its gold trade, which reached the rest of Africa and parts of Europe. Ghanaian kings controlled gold-mining operations and implemented a system of taxation that solidified their control of the region for about 400 years.
The Kingdom of Mali (Mali Empire) expanded the Kingdom of Ghana's trade operations to include trade in salt and copper. The Kingdom of Mali's great wealth contributed to the creation of learning centers where Muslim scholars from around the world came to study. These centers greatly added to Africa's cultural and academic enrichment.
The Kingdom of Songhai (Songhai Empire) combined the powerful forces of Islam, commercial trade, and scholarship. Songhai kings expanded trade routes, set up a new system of laws, expanded the military, and encouraged scholarship to unify and stabilize their empire. Their economic and social power was anchored by the Islamic faith.
Colonization dramatically changed Africa. From the 1880s to the 1900s, almost all of Africa was exploited and colonized, a period known as the "Scramble for Africa." European powers saw Africa as a source of raw materials and a market for manufactured goods. Important European colonizers included Britain, France, Germany, Belgium, and Italy.
The legacy of colonialism haunts Africa today. Colonialism forced environmental, political, social, and religious change on Africa. Natural resources, including diamonds and gold, were over-exploited. European business owners benefitted from trade in these natural resources, while Africans labored in poor conditions without adequate pay.
European powers drew new political borders that divided established governments and cultural groups. These new boundaries also forced different cultural groups to live together. This restructuring process brought out cultural tensions, causing deep ethnic conflict that continues today.
In Africa, Islam and Christianity grew with colonialism. Christianity was spread through the work of European missionaries, while Islam consolidated its power in certain undisturbed regions and urban centers.
World War II (1939-1945) empowered Africans to confront colonial rule. Africans were inspired by their service in the Allies' forces and by the Allies' commitment to the rights of self-government. Africans' belief in the possibility of independence was further supported by the independence of India and Pakistan in 1947. Mahatma Gandhi, an Indian independence leader who began his career in South Africa, said: "I venture to think that the Allied declaration that the Allies are fighting to make the world safe for the freedom of the individual and for democracy sounds hollow so long as India, and for that matter Africa, are exploited by Great Britain."
By 1966, all but six African countries were independent nation-states. Funding from the Soviet Union and independent African states was integral to the success of Africa's independence movements. Regions in Africa continue to fight for their political independence. Western Sahara, for instance, has been under Moroccan control since 1979. The United Nations is currently sponsoring talks between Morocco and a Western Sahara rebel group called the Polisario Front, which supports independence.
Managing inter-ethnic conflict continues to be an important factor in maintaining national, regional, and continent-wide security. One of the chief areas of conflict is the struggle between sedentary and nomadic groups over control of resources and land.
The conflict in Sudan&rsquos Darfur region, for example, is between nomadic and sedentary communities who are fighting over water and grazing rights for livestock. The conflict also involves religious, cultural, and economic tensions. In 2003, the Sudan Liberation Army (SLA) and Justice and Equality Movement (JEM), groups from Darfur, attacked government targets in Sudan&rsquos capital, Khartoum.
The SLA and JEM came from different cultural backgrounds than the government of Sudan. Darfurians are mostly "black" Africans, meaning their cultural identity is rooted in a region south of the Sahara, while the Sudanese government is dominated by Arabs, people with cultural ties to North Africa and the Arabian Peninsula. Both groups are predominantly Muslim, so the conflict is primarily ethnic and economic rather than religious. The SLA and JEM were mostly farmers. They claimed nomadic Arab groups consistently trespassed on their land and violated their grazing rights.
The Sudanese government responded violently to the attacks by the SLA and JEM. Many international organizations believe the government had a direct relationship with the Arab Janjaweed. The Janjaweed are militias, or independent armed groups. The Janjaweed routinely stole from, kidnapped, killed, and raped Darfurians to force them off their land. The United Nations says up to 300,000 people have died as a result of war, hunger, and disease. More than 2.7 million people have fled their homes to live in insecure and impoverished camps.
The international community's response to this conflict has been extensive. Thousands of African Union-United Nations peacekeepers remain in the region. Other groups have organized peace talks between government officials and JEM, culminating in a 2009 peace deal signed in Qatar. The International Criminal Court in The Hague has issued an arrest warrant for Sudanese President Omar al-Bashir for war crimes and crimes against humanity.
As a result of ethnic conflicts like the one in Darfur, Africa has more internally displaced people (IDPs) than any other continent. IDPs are people who are forced to flee their homes but who, unlike refugees, remain within their country's borders. In 2009, there were an estimated 11.6 million IDPs in Africa, representing more than 40 percent of the world's total IDP population.
Regional and international political bodies have taken important steps in resolving the causes and effects of internal displacement. In October 2009, the African Union adopted the Kampala Convention, recognized as the first agreement in the world to protect the rights of IDPs.
Africa's most pressing issues can be framed through the United Nations' Millennium Development Goals (MDGs). All 192 members of the United Nations and at least 23 international organizations have agreed to meet the goals by 2015. These goals are:
1) eradicate extreme poverty and hunger
2) achieve universal primary education
3) promote gender equality and empower women
4) reduce child mortality rates
5) improve maternal health
6) combat HIV/AIDS, malaria, and other diseases
7) ensure environmental sustainability
8) develop a global partnership for development.
These issues disproportionately affect Africa. Because of this, the international community has focused its attention on the continent.
Many parts of Africa are affected by hunger and extreme poverty. In 2009, 22 of 24 nations identified as having "Low Human Development" on the U.N.'s Human Development Index were located in Sub-Saharan Africa. In many nations, gross domestic product per person is less than $200 per year, with the vast majority of the population living on less than $1 per day.
Africa&rsquos committee for the Millennium Development Goals focuses on three key issues: increasing agricultural productivity, building infrastructure, and creating nutrition and school feeding programs. Key goals include doubling food yields by 2012, halving the proportion of people without access to adequate water supply and sanitation, and providing universal access to critical nutrition.
Scholars, scientists, and politicians believe climate change will negatively affect the economic and social well-being of Africa more than any other continent. Rising temperatures have caused precipitation patterns to change, crops to reach the upper limits of heat tolerance, pastoral farmers to spend more time in search of water supplies, and malaria and other diseases to spread throughout the continent.
International organizations and agreements, such as the Copenhagen Accord, have guaranteed funding for measures to combat or reduce the effects of climate change in Africa. Many African politicians and scholars, however, are critical of this funding. They say it addresses the effects of climate change after they occur, rather than creating programs to prevent global warming, the current period of climate change. African leaders also criticize developed countries for not making more of an internal commitment to reducing carbon emissions. Developed countries, not Africa, are the world&rsquos largest producers of carbon emissions.
What is certain is that Africa will need foreign assistance in order to successfully combat climate change. Leaders within Africa and outside it will need to seek greater international cooperation for this to become a reality.
Africa is sometimes nicknamed the "Mother Continent" because it is the oldest inhabited continent on Earth.
History of Computers: A Brief Timeline
The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card based computers that took up entire rooms.
Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games and stream multimedia in addition to crunching numbers.
1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.
1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. The project, funded by the English government, is a failure. More than a century later, however, the world's first computers would actually be built on similar principles.
1890: Herman Hollerith designs a punch card system to calculate the 1880 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.
1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.
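Turing's idea is easier to grasp with a concrete sketch: a machine is just a table of rules mapping (state, symbol read) to (symbol to write, head move, next state). The following minimal Python simulator is an illustration, not any historical design; the state names and the example "bit-inverting" machine are invented for this sketch.

```python
# Minimal Turing machine simulator. The transition table maps
# (state, symbol) -> (symbol to write, head move, next state).
def run_turing_machine(rules, tape, state="start", halt="halt"):
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != halt:
        symbol = tape.get(head, "_")  # "_" is the blank symbol
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    # Read the tape back in positional order.
    return "".join(tape[i] for i in sorted(tape))

# Example machine: scan right, inverting each bit, halt at the first blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(invert, "10110"))  # prints "01001_"
```

Any algorithm can in principle be expressed as such a rule table, which is the sense in which the Turing machine is "capable of computing anything that is computable."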
1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts or shafts.
1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a Palo Alto, California, garage, according to the Computer History Museum.
1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information in its main memory.
1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.
1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.
1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum.
1953: Grace Hopper develops the first compiler, which translates written instructions into code computers can read; her work eventually leads to COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.
1954: The FORTRAN programming language, an acronym for FORmula TRANslation, is developed by a team of programmers at IBM led by John Backus, according to the University of Michigan.
1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.
1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.
1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users.
1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.
1971: Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be shared among computers.
1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.
1974-1977: A number of personal computers hit the market, including the Scelbi, Mark-8, Altair 8800, IBM 5100, Radio Shack's TRS-80 (affectionately known as the "Trash 80") and the Commodore PET.
1975: The January issue of Popular Electronics magazine features the Altair 8800, described as the "world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.
1976: Steve Jobs and Steve Wozniak start Apple Computers on April Fool's Day and roll out the Apple I, the first computer with a single-circuit board, according to Stanford University.
1977: Radio Shack's initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.
1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.
1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.
1979: Word processing becomes a reality as MicroPro International releases WordStar. "The defining change was to add margins and word wrap," said creator Rob Barnaby in email to Mike Petrie in 2000. "Additional changes included getting rid of command mode and adding a print function. I was the technical brains: I figured out how to do it, and did it, and documented it."
1981: The first IBM personal computer, code-named "Acorn," is introduced. It uses Microsoft's MS-DOS operating system. It has an Intel chip, two floppy disks and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC.
1983: Apple's Lisa is the first personal computer with a GUI. It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a "laptop."
1985: Microsoft announces Windows, according to Encyclopedia Britannica. This was the company's response to Apple's GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.
1985: The first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.
1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to that of mainframes.
1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.
1993: The Pentium microprocessor advances the use of graphics and music on PCs.
1994: PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" are among the games to hit the market.
1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.
1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple's court case against Microsoft in which it alleged that Microsoft copied the "look and feel" of its operating system.
1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.
2001: Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.
2003: The first 64-bit processor, AMD's Athlon 64, becomes available to the consumer market.
2004: Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.
2005: YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.
2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo's Wii game console hits the market.
2007: The iPhone brings many computer functions to the smartphone.
2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.
2010: Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment.
2011: Google releases the Chromebook, a laptop that runs the Google Chrome OS.
2012: Facebook reaches 1 billion users on October 4.
2015: Apple releases the Apple Watch. Microsoft releases Windows 10.
2016: The first reprogrammable quantum computer is created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.
2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."
Additional reporting by Alina Bradford, Live Science contributor.