Channel: Hacker News

The Ornithologist the Internet Called a Murderer


Marc Bekoff, a researcher focusing on animal consciousness at the University of Colorado, Boulder, fumed on Huffington Post that killing “‘in the name of education’ or ‘in the name of whatever’ simply needs to stop.” He added, “It is wrong and sets a horrific precedent for future research and for children.”

Colleen O’Brien, a director at PETA, condemned it as “perverse, cruel and the sort of act that has led to the extinction of other animals who were also viewed as ‘specimens.’” All that was needed to document the bird, she argued, was “compassion, awe and a camera, not disregard and a death warrant.”

While Dr. Filardi was still on the mountain, almost entirely off the grid, the rage spread. Tens of thousands of people signed petitions that condemned his actions, and thousands more signed a petition calling for him to be fired, or even jailed.

The museum frantically tried to reach him. People were trying to hack into his Facebook account, which was quickly disabled. The pages of his children were targeted. His wife began receiving phone calls with death threats, at all hours of the night. A petition that stated, “Chris Filardi is a disgrace and frankly does not deserve to breathe another breath,” was signed by 3,798 people.

He descended from the mountaintop into an inferno of hate. “If they wanted to make me feel horrible and more than a little frightened for my family or welfare,” he told me, his voice strained, “it worked.”

When he returned to the museum in New York, police officers told him to “be conscious” of how he entered it, never walking in the front door with the rest of the public, but using a back door instead.

Dr. Filardi hoped the threats were just talk, but he kept thinking back to May 2001, when an eco-terrorist group known as the Earth Liberation Front firebombed the Center for Urban Horticulture at the University of Washington, where Dr. Filardi was a graduate student. The bombing was aimed at the work of a professor they mistakenly believed was releasing genetically engineered poplar trees into the wild. (No one was hurt.)


University of Chicago Drops SAT, ACT Requirement for Admissions


The University of Chicago has dropped an admission requirement for students to submit either SAT or ACT test scores, becoming the most prestigious university to do so and joining hundreds of others in the test-optional movement.

The university’s initiative, announced Thursday, “levels the playing field” for first-generation and low-income students, said James G. Nondorf, dean of admissions and vice president of enrollment and student advancement.

“Some students are good testers, some students are not,” Mr. Nondorf said. “We want to remove any policy or program that we have that advantages one group of students over the other.”

Advocates of the test-optional movement praised the decision, calling it a “major milestone.”

“I think it’ll have an effect across the spectrum. It breaks the ice for this real top-tier of nationally selective colleges,” said Robert Schaeffer, public education director of the National Center for Fair and Open Testing, known as FairTest.

Organizations that administer the ACT and SAT noted that most applicants to four-year colleges go to institutions that rely on the exams to help determine admission.

“Comparing students based on widely different sources of information with no common metric increases the subjectivity of admissions decisions,” ACT spokesman Ed Colby said in a statement.

SAT spokesman Zach Goldberg said that with research on grade inflation showing high-school GPAs are higher than ever, it’s important to have another measure like the SAT.

University of Chicago officials said their plan, called the UChicago Empower Initiative, will begin with the Class of 2023. It also includes providing full-tuition scholarships to first-generation and low-income students whose families earn less than $125,000 and for select children of police officers and firefighters nationwide, according to the university.

Admission requirements at the university include submitting an application and providing a school transcript and two teacher evaluations.

According to FairTest, more than 1,000 accredited, four-year colleges and universities now make decisions about all or many applicants without regard to ACT or SAT test scores. Other universities with optional testing include Bates College, Pitzer College and Wesleyan University. Most of the schools are private.

Mr. Schaeffer said research shows that when schools go test-optional, they increase diversity in terms of race, geography and first-generation status.

“It encourages more kids with real talent to think about applying,” he said.

SpinLaunch has secured $40M to build a machine to hurl rockets into space


Flying cars. Cures for death. And now ... space catapults. Bless you, California, for not letting reality get you down.

On Thursday, a Silicon Valley startup called SpinLaunch Inc. will reveal the first details of its plans to build a machine meant to hurl rockets into space. To achieve that goal, SpinLaunch has secured $40 million from some top technology investors, said Jonathan Yaney, the founder.

The company remains tight-lipped about exactly how this contraption will work, although its name gives away the basic idea. Rather than using propellants like kerosene and liquid oxygen to ignite a fire under a rocket, SpinLaunch plans to get a rocket spinning in a circle at up to 5,000 miles per hour and then let it go—more or less throwing the rocket to the edge of space, at which point it can light up and deliver objects like satellites into orbit.

Why would anyone do such a thing? Well, Yaney is trying to work around the limits that physics has placed on the rocket launch industry for decades. To overcome gravity and Earth’s atmosphere, rockets must be almost perfectly engineered and, even then, can only push a relatively small payload into space. The items carried on a typical rocket, for example, make up less than 5 percent of the rocket’s mass, with the rest going toward fuel and the rocket’s body. (An airplane, by contrast, can dedicate up to half its mass to cargo.)

SpinLaunch’s so-called kinetic energy launch system would use electricity to accelerate a projectile and help do much of the dirty work fighting through gravity and the atmosphere. In theory, this means the company could build a simpler, less expensive rocket that’s more efficient at ferrying satellites. “Some people call it a non-rocket launch,” said Yaney. “It seems crazy. It seems fantastic. But we are actually using relatively low-tech industrial components to break this problem into manageable chunks.”

An impressive group of investors have signed on to support Yaney’s vision. The bulk of the $40 million came from Alphabet Inc.’s GV (formerly Google Ventures), Kleiner Perkins Caufield & Byers and Airbus Ventures.

Over the past few years, the rocket industry has become quite crowded. Following in the footsteps of Elon Musk’s Space Exploration Technologies Corp., dozens of companies have appeared, trying to make small, cheap rockets that can be launched every week or perhaps even every day. These smaller rockets have been built to carry a new breed of shoebox-sized satellites—dubbed smallsats—that are packed full of imaging, telecommunications and scientific equipment. The small rockets, though, are really just miniaturized versions of the large, traditional rockets that have flown for decades. SpinLaunch is an entirely new take on the rocket-launch concept itself.

“We are very intrigued by SpinLaunch’s innovative use of rotational kinetic energy to revolutionize the smallsat market,” Wen Hsieh, a general partner at Kleiner Perkins, said in an emailed statement. “SpinLaunch can be powered by renewable energy sources, such as solar and wind, thereby eliminating the use of toxic and dangerous rocket fuels.”

SpinLaunch has a working prototype of its launcher, although the company has declined to provide details on exactly how the machine operates or will compare to its final system. The startup plans to begin launching by 2022. It will charge less than $500,000 per launch and be able to send up multiple rockets per day. The world’s top rocket companies usually launch about once a month, and most of SpinLaunch’s rivals have been aiming for $2 million to $10 million per launch for small rockets. If the startup were able to reach its goals, it would easily be the cheapest and most prolific small launcher on the market.

The company will, of course, need to build its own launch facility and then prove this technology actually works—no small feat. “We are evaluating five potential launch sites within the United States,” Yaney said.

Yaney grew up in California and has run a variety of businesses, from software makers to construction companies. When it comes to aerospace engineering, he’s self-taught, having pored over textbooks in the years leading up to the founding of SpinLaunch in 2014.

The idea of a rocket slingshot seems like science fiction, and Yaney has nothing resembling the classic background for a rocket maker. Still, some experts in the field who have seen the prototype were impressed by Yaney and think the company has a fighting chance. One such believer is Simon “Pete” Worden, the former director of NASA’s Ames Research Center and a well-known expert in the aerospace field who’s unaffiliated with SpinLaunch. “It’s a very good approach in my opinion,” Worden said.

We're Looking for Engineering Managers and Leaders – Come Join Our Team (YC W12)


The Muse is seeking a talented Engineering Manager who is curious and motivated by leading others to solve tough problems and coaching engineers to realize their full potential. You’ll report to the CTO and be responsible for managing, growing, and organizing the teams responsible for every feature you see on our website (and many you don’t!). Join our team of 118+ smart, curious, respectful, friendly, fun, driven Musers who believe in making the world of work better.

About The Muse

TheMuse.com strives to humanize the career and job search landscape by being a companion to millions of people as they seek continuous career satisfaction—not just another job. Companies partner with The Muse as they look to attract and retain the best talent by leveraging our capabilities to tell authentic and compelling employer stories. We foster meaningful relationships between individuals and companies by enabling them to engage with each other on a human level through storytelling and technology. We do all this because we truly believe life is too short to hate your career!

Founded in 2011, The Muse draws over 50 million users to TheMuse.com for original career advice from prominent experts, access to the best coaches, and a behind-the-scenes look at job opportunities, and over 700 companies trust The Muse to help them strengthen their employer brands to win the war for talent. The majority of the team is headquartered in New York, NY (37th & Broadway).

How You’ll Make an Impact

  • You’ll coach a team of 7-8 engineers in a full team of 24.
  • You’ll be instrumental in hiring new engineers who fit the needs of the business and make the team culture better.
  • You’ll help your team grow and evolve, and make sure that our engineers have an opportunity to improve professionally.
  • You’ll partner with our Product team to plan, design, and develop all new consumer- and client-facing features.
  • You’ll be responsible for the maintenance of all existing systems, and sunsetting of those systems we no longer need.

Why We’ll Love You

  • You’ve had experience managing engineering teams at a startup.
  • You take a pragmatic approach to technology, and you like it best when it helps make someone else’s life better.
  • You’re familiar with agile management techniques, and understand how to create an effective team structure and workflow.
  • You’re an excellent communicator who wants to work alongside the other teams in the company to help them use data to answer their questions—and to ask better ones.
  • You’re comfortable with, and embrace, remote work and distributed teams.
  • You’re supportive and fair, showing genuine interest in the well-being and success of your team. You set a positive tone and recognize and reward individual contributions.
  • You treat everyone with respect and fairness.
  • Like us, you’re all about mentorship and professional growth, and you understand that a manager’s most important long-term goal is to create more senior engineers.

Why You’ll Love Us

  • We’re fanatical about finding the right tool for each job. We pick technologies because they make sense, and we’re not afraid to try something new when we feel that it could be useful, even if we’ve never used it before.
  • Our team works on problems, not tasks. We like to help other teams turn their needs into great technology, and we give each engineer the freedom to tackle the challenges that they find most interesting.
  • We move fast, with multiple releases every day and an iterative approach to developing new features and measuring their success.
  • You’ll work at a tech company founded by two badass women. Our founders believe transparency is important, so they really try to share as much as they can about changes to The Muse strategy, board meetings, and when they are wrestling with big company-wide decisions.
  • The Muse actually has—and sticks to—a “no assholes” policy, so you can come to work every day knowing you will always be surrounded by good people who genuinely care about you.
  • We invest in growing our people—personally and professionally
  • We offer unlimited vacation—and we mean it!

Our Tech

In an industry full of WordPress installs, our tech stack stands above the rest. We've created a custom web application that builds on:

  • Python 3 and Go on the backend
  • TypeScript and React on the frontend
  • PostgreSQL, Elasticsearch and Redis for data management
  • Node.js and Ruby on Rails for various tools and services

At The Muse, we believe that great ideas come from anywhere. We support a collaborative environment and value open participation from individuals with different ideas, experiences, and perspectives. We believe having a diverse team makes The Muse a more interesting and innovative place to work, and we strive every day to make The Muse a welcoming and inclusive place for all.

If this could be your dream job, please submit a cover letter and resume, so we can get to know you a little better.

Frida Kahlo and the birth of Fridolatry


Frida, the unapologetic bitch. Frida, the disabled artist. Frida, symbol of radical feminism. Frida, the victim of Diego. Frida, the chic, gender-fluid, beautiful and monstrous icon. Frida tote bags, Frida keychains, Frida T-shirts. And also, this year’s new Frida Barbie doll (no unibrow). Frida Kahlo has been subject to global scrutiny and commercial exploitation. She has been appropriated by curators, historians, artists, actors, activists, Mexican consulates, museums and Madonna.

Over the years, this avalanche has trivialised Kahlo’s work to fit a shallow “Fridolatry”. And, while some criticism has been able to counter the views that cast her as a naive, infantile, almost involuntary artist, most narratives have continued to position her as a geographically marginal painter: one more developing-world artist waiting to be “discovered”, one more voiceless subject waiting to be “translated”.

Appropriated and exploited... The Frida Kahlo Barbie. Photograph: Barbie/AP

In 1938, Frida Kahlo painted Lo que el agua me dio (What the Water Gave Me), the painting perhaps responsible for launching her international career, but also her international mistranslation. In this self-portrait of sorts, we see Kahlo’s feet and calves inside a bathtub and above them, as if emanating from the steam, a collaged landscape: an erupting volcano out of which a skyscraper emerges; a dead bird resting on a tree; a strangled woman; a Tehuana dress dramatically spread out; a female couple resting on a floating cork. Kahlo was working on Lo que el agua me dio when the French surrealist André Breton arrived in Mexico for a visit. He was transfixed by it. He called Kahlo a “natural surrealist”, and in a brochure endorsing her New York debut at Julien Levy’s gallery in 1938, he wrote: “My surprise and joy were unbounded when I discovered, on my arrival in Mexico, that her work has blossomed forth, in her latest paintings, into pure surreality, despite the fact that it had been conceived without any prior knowledge whatsoever of the ideas motivating the activities of my friends and myself.”

Lo que el agua me dio (What the Water Gave Me). Photograph: Alamy

Though “natural surrealist” was a label that helped translate Kahlo’s paintings for European and American audiences, it was one that she always rejected. To be projected as a “surrealist” in Europe helped audiences to understand her work more immediately – more palatably. She was branded as authentically Mexican, with international flair. But to be seen as a “natural surrealist” also transformed her into a kind of sauvage: unconscious of her talent, unsuspecting of her mastery. After her debut, a Time magazine critic described her work as having “the daintiness of miniatures, the vivid reds and yellows of Mexican tradition and the playfully bloody fancy of an unsentimental child.”

Kahlo was hardly unsuspecting, hardly unconscious of what she was doing and who she was. She knew how to capitalise on the elements of her private life and cultural heritage, curate them carefully, and use them to build her public persona. She was a mestiza, born in Mexico City, who had adopted a traditional Zapotec-Tehuana “look”. Her father, the German-born Carl Wilhelm “Guillermo” Kahlo, was a well-known photographer, and the family lived in a neocolonial mansion in Coyoacán, the famous Casa Azul. Kahlo was very much aware of the complex politics of selfhood she was creating and manipulating. In a 1939 photograph taken during the opening of Kahlo’s first exhibition in Paris, she is posing in front of Lo que el agua me dio. She is wearing a Tehuana dress and her unibrow is underscored with black eyeliner: Frida representing Frida. (It is unclear which one is the artwork.)

The way Kahlo’s work and persona were read in Mexico was of course very different from the way they were translated into other cultural milieux. Just as Breton had attached the category “natural surrealist” to her art and framed her work in a discourse that she herself did not embrace, many others did the same with various aspects of her public and private life.

A post-revolutionary symbol of modernity: The casa-estudio designed for Rivera and Kahlo by Juan O’Gorman. Photograph: Pawel Toczynski/Getty Images

An interesting example of this is the house and studio in Mexico City where she and Diego Rivera lived and worked during some of their most productive years in the 1930s. It was designed by Juan O’Gorman, the young architect who was then pioneering the radical architectural changes that took place in post-revolutionary Mexico City.

Before the Mexican Revolution (1910-20), 19th-century neoclassical and colonial architecture dominated. French-influenced mansions across the city stood like lonely homages to a quickly decaying European noble class, and the family life of the Mexican bourgeoisie played out in the sumptuous and darkened stages of these interiors, with their heavy drapes and excessive ornamentation. But after the revolution new ideas about hygiene, ventilation, comfort, efficiency and simplicity made their way into the city. Houses, and with them daily life, were transformed radically and rapidly.

Attuned to the ideological and architectural changes taking place, the couple asked O’Gorman to design a studio and house for them. He created a space specifically for a couple of painters – at once separated and connected. The buildings were the first in Mexico designed for specific functional requirements: living, painting and showcasing work.

In 1933, a few years after Kahlo and Rivera married, they moved in. Rivera’s area was larger, with more work space. Kahlo’s was more “homely”, with a studio that could transform into a bedroom. A flight of stairs led from her studio to a rooftop, which was connected by a bridge to Rivera’s space. Beyond being a workplace, it became a space for the couple’s extramarital affairs: Rivera, with his models and secretaries; Kahlo, with certain talented and famous men, from the sculptor and designer Isamu Noguchi to Leon Trotsky. Perhaps without knowing it, O’Gorman designed a house whose function it was to allow an “open” relationship.

The house was an emblem of modernity and a kind of manifesto: a solitary example of a new functionalism in a city that was still trying to find a national architectural language that best suited its revolutionary programme. It did not encode traditional values or messages. It simply addressed the practical necessities of its dwellers, was materially efficient (primarily made of reinforced concrete), socially progressive and cheap.

With time, however, as neutral as the buildings may have been intended to be in their architecture, they ended up functioning as a site of Mexican cultural capital, especially one connected to indigenous Mexican craftsmanship. The couple were hosts to visitors who came to see their work and works-in-progress, as well as their collections of arts and crafts: Trotsky, Nelson Rockefeller, Pablo Neruda, John Dos Passos, Sergei Eisenstein, Breton.

Diego Rivera and Frida Kahlo. Kahlo’s self-portrait, The Two Fridas (1939), hangs in the background. Photograph: Hulton Archive/Getty Images

O’Gorman gave Rivera and Kahlo a machine to live in, as Le Corbusier would have had it, but also a machine to translate in. Their home brought in foreignness as much as it served as a platform to project a particular idea of Mexico to the world. More than anything, it provided the stage for the power couple of Mexican modernity: cosmopolitan, sophisticated, well-connected and more Mexican than Mexico. The couple’s ultimate oeuvre was, of course, themselves. Kahlo and Rivera were, perhaps, Mexico’s first performance artists, and their casa-estudio was their very own gallery.

In 1934 the photographer Martin Munkacsi visited Mexico and copiously documented the house and studios. The pictures had been commissioned for Harper’s Bazaar, the New York-based fashion magazine which was directed at an upper-class female audience, mostly American but also French and British. In Harper’s July 1934 issue, a double-page spread titled “Colors of Mexico” displayed three of Munkacsi’s many photographs: one of Kahlo crossing the bridge from one house to the other; one of Rivera working in his studio; and one of Frida climbing the staircase to the roof. In the centre of the layout, there is a large photograph of the couple walking beside the cactus fence; a caption explains “Diego Rivera with Señora Freida [sic] Kahlo de Rivera before the cactus fence of their Mexico City home.”

The buildings were designed to embody a proletkult ideology, resembling a factory or industrial complex, with its visible water tanks, its exposed materials and raised supporting columns. The cacti fence surrounding the house, if seen in relation to it, added to the general industrial feeling. However, Harper’s chose the image that best decontextualised the cacti fence and thus presented it as a folkloric, decorative element. To the right of that central image appeared a series of photographs of barefooted Mexican peasants selling crafts and riding mules.

An accompanying piece written by Harry Block – a New York editor – describes his search for the perfect Mexican sandals: “All Mexico walks on huaraches (pronounced wahratchehs and meaning sandals) …” Juxtaposed with the portrait of Rivera and Kahlo – he, dressed like a European dandy, solid leather shoes included; she, wearing pointy black boots – Block’s ode to the huarache seems rather forced.

The Harper’s piece is a perfect example of how Mexico was perpetuated in such stories as a marginal space, with glimpses of modernity a rare exception to the rule. The magazine shows an utterly foreign Mexico, but in a way that also makes it easier to capture and explain to foreign audiences through its associated cliches. It is a form of translation that simplifies the complex operations that took place in the Rivera-Kahlo home.

A functionalist Mexican house that showcased post-revolutionary art? Impossible! Let’s just use the picture with the cacti.

This instance of colonising narratives in cultural translations was not the end but the beginning. In 2002, Harvey Weinstein’s company distributed the film Frida, starring Salma Hayek, asked for a sexier Kahlo (more nudity, less unibrow) and got away with it. In a 2016 concert stunt in Mexico, Madonna pulled a Frida lookalike from the audience, said she was “so excited” to finally meet Frida, and then handed her a banana as a token.

Last Halloween, my 21-year-old niece was dragged by her friend to a New York college party. She wasn’t wearing a costume, was not really in the mood. At some point, a trio of Wonder Women stumbled in: red knee-high boots, star print bikini-bottoms, strapless tops, gold headbands fastened around long blond hair.

One of the three wonders took a long swig from a bottle and almost fell back, suddenly noticing my niece standing behind her. She turned round and looked straight into her face. She studied it up close. Like many women in my maternal family, my niece inherited a dark, robust unibrow. The Wonder Woman finally said: “Oh my god, it’s Frida Kahlo!”

Valeria Luiselli’s Tell Me How It Ends is published by 4th Estate. Frida Kahlo: Making Herself Up opens at the V&A, London SW7, on 16 June. vam.ac.uk.

Pure Bash Bible – A collection of pure bash alternatives to external processes



A [WIP] collection of pure bash alternatives to external processes.

The goal of this repository is to document known and unknown methods of doing various tasks using only built-in bash features. Using the snippets from this guide can help remove unneeded dependencies from your scripts and, in most cases, make them that little bit faster. I came across these tips and discovered a few while developing neofetch, pxltrm and some other smaller projects.

The snippets below are linted using shellcheck and tests have been written where applicable. If you're looking to contribute, have a read of the CONTRIBUTING.md. It outlines how the unit tests work and what's required when adding snippets.

If you see something that is incorrectly described, buggy or outright wrong, open an issue or send a pull request. If you know a handy snippet that is not included in this list, contribute!

NOTE: Error handling (checking if a file exists, etc) is not included. These are meant to be snippets you can incorporate into your scripts and not full blown utilities.


Trim leading and trailing white-space from string.

This is an alternative to sed, awk, perl and other tools. The function below works by finding all leading and trailing white-space and removing it from the start and end of the string. The : built-in is used in place of a temporary variable.

Example Function:

trim_string() {
    # Usage: trim_string "   example   string    "
    : "${1#"${1%%[![:space:]]*}"}"
    : "${_%"${_##*[![:space:]]}"}"
    printf '%s\n' "$_"
}

Example Usage:

$ trim_string "    Hello,  World    "
Hello,  World

$ name="   John Black  "
$ trim_string "$name"
John Black

Trim all white-space from string and truncate spaces.

This is an alternative to sed, awk, perl and other tools. The function below works by abusing word splitting to create a new string without leading/trailing white-space and with truncated spaces.

Example Function:

# shellcheck disable=SC2086,SC2048
trim_all() {
    # Usage: trim_all "   example   string    "
    set -f
    set -- $*
    printf '%s\n' "$*"
    set +f
}

Example Usage:

$ trim_all "    Hello,    World    "
Hello, World

$ name="   John   Black  is     my    name.    "
$ trim_all "$name"
John Black is my name.

Use REGEX on a string.

We can use the result of bash's regex matching to replace sed for a large number of use-cases.

CAVEAT: This is one of the few platform-dependent bash features. Bash will use whatever regex engine is installed on the user's system. Stick to POSIX regex features if aiming for compatibility.

CAVEAT: This example only prints the first matching group. When using multiple capture groups some modification is needed.

Example Function:

regex() {
    # Usage: regex "string" "regex"
    [[ $1 =~ $2 ]] && printf '%s\n' "${BASH_REMATCH[1]}"
}

Example Usage:

$ # Trim leading white-space.
$ regex '    hello' '^\s*(.*)'
hello

$ # Validate a hex color.
$ regex "#FFFFFF" '^(#?([a-fA-F0-9]{6}|[a-fA-F0-9]{3}))$'
#FFFFFF

$ # Validate a hex color (invalid).
$ regex "red" '^(#?([a-fA-F0-9]{6}|[a-fA-F0-9]{3}))$'
# no output (invalid)

Example Usage in script:

is_hex_color() {
    if [[ "$1" =~ ^(#?([a-fA-F0-9]{6}|[a-fA-F0-9]{3}))$ ]]; then
        printf '%s\n' "${BASH_REMATCH[1]}"
    else
        printf '%s\n' "error: $1 is an invalid color."
        return 1
    fi
}

read -r color
is_hex_color "$color" || color="#FFFFFF"

# Do stuff.

Split a string on a delimiter.

This is an alternative to cut, awk and other tools.

string="1,2,3"

# To multiple variables.
IFS=, read -r var1 var2 var3 <<< "$string"

# To an array.
IFS=, read -ra vars <<< "$string"
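The same read-based trick can be wrapped in a reusable function. This is a sketch (the function name `split` is my own addition, not necessarily the guide's): it replaces every occurrence of a fixed delimiter with a newline and reads the pieces into an array.

```shell
split() {
    # Usage: split "string" "delimiter"
    # Replace every delimiter with a newline, then read the
    # resulting lines into an array (-d "" reads to end of input).
    IFS=$'\n' read -d "" -ra arr <<< "${1//$2/$'\n'}"
    printf '%s\n' "${arr[@]}"
}

split "apples,oranges,pears" ","
```

Note that `read` returns non-zero here (no NUL terminator is found in the input), so guard the call if your script runs under `set -e`.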

Change a string to lowercase.

CAVEAT: Requires bash 4+

Example Function:

lower() {
    # Usage: lower "string"
    printf '%s\n' "${1,,}"
}

Example Usage:

$ lower "HELLO"
hello

$ lower "HeLlO"
hello

$ lower "hello"
hello

Change a string to uppercase.

CAVEAT: Requires bash 4+

Example Function:

upper() {
    # Usage: upper "string"
    printf '%s\n' "${1^^}"
}

Example Usage:

$ upper "hello"
HELLO

$ upper "HeLlO"
HELLO

$ upper "HELLO"
HELLO

Trim quotes from a string.

Example Function:

trim_quotes() {
    # Usage: trim_quotes "string"
    : "${1//\'}"
    printf '%s\n' "${_//\"}"
}

Example Usage:

$ var="'Hello', \"World\""
$ trim_quotes "$var"
Hello, World

Strip all instances of pattern from string.

Example Function:

strip_all() {
    # Usage: strip_all "string" "pattern"
    printf '%s\n' "${1//$2}"
}

Example Usage:

$ strip_all "The Quick Brown Fox" "[aeiou]"
Th Qck Brwn Fx

$ strip_all "The Quick Brown Fox" "[[:space:]]"
TheQuickBrownFox

$ strip_all "The Quick Brown Fox" "Quick "
The Brown Fox

Strip first occurrence of pattern from string.

Example Function:

strip() {
    # Usage: strip "string" "pattern"
    printf '%s\n' "${1/$2}"
}

Example Usage:

$ strip "The Quick Brown Fox" "[aeiou]"
Th Quick Brown Fox

$ strip "The Quick Brown Fox" "[[:space:]]"
TheQuick Brown Fox

Strip pattern from start of string.

Example Function:

lstrip() {
    # Usage: lstrip "string" "pattern"
    printf '%s\n' "${1##$2}"
}

Example Usage:

$ lstrip "The Quick Brown Fox" "The "
Quick Brown Fox

Strip pattern from end of string.

Example Function:

rstrip() {
    # Usage: rstrip "string" "pattern"
    printf '%s\n' "${1%%$2}"
}

Example Usage:

$ rstrip "The Quick Brown Fox" " Fox"
The Quick Brown

Check if string contains a sub-string.

Using a test:

if [[ "$var" == *sub_string* ]]; then
    printf '%s\n' "sub_string is in var."
fi

# Inverse (substring not in string).
if [[ "$var" != *sub_string* ]]; then
    printf '%s\n' "sub_string is not in var."
fi

# This works for arrays too!
if [[ "${arr[*]}" == *sub_string* ]]; then
    printf '%s\n' "sub_string is in array."
fi

Check if string starts with sub-string.

if [[ "$var" == sub_string* ]]; then
    printf '%s\n' "var starts with sub_string."
fi

# Inverse (var doesn't start with sub_string).
if [[ "$var" != sub_string* ]]; then
    printf '%s\n' "var does not start with sub_string."
fi

Check if string ends with sub-string.

if [[ "$var" == *sub_string ]]; then
    printf '%s\n' "var ends with sub_string."
fi

# Inverse (var doesn't end with sub_string).
if [[ "$var" != *sub_string ]]; then
    printf '%s\n' "var does not end with sub_string."
fi

Using a case statement:

case "$var" in
    *sub_string*)
        # Do stuff
    ;;

    *sub_string2*)
        # Do more stuff
    ;;

    *)
        # Else
    ;;
esac

Assign and access a variable using a variable.

hello_world="test"

# Create the variable name.
var1="world"
var2="hello_${var1}"

# Print the value of the variable name stored in 'hello_$var1'.
printf '%s\n' "${!var2}"

Reverse an array.

Enabling extdebug allows access to the BASH_ARGV array which stores the current function’s arguments in reverse.

Example Function:

reverse_array() {
    # Usage: reverse_array "array"
    shopt -s extdebug
    f()(printf '%s\n' "${BASH_ARGV[@]}"); f "$@"
    shopt -u extdebug
}

Example Usage:

$ reverse_array 1 2 3 4 5
5
4
3
2
1

$ arr=(red blue green)
$ reverse_array "${arr[@]}"
green
blue
red

Remove duplicate array elements.

Create a temporary associative array. When setting associative array values and a duplicate assignment occurs, bash overwrites the key. This allows us to effectively remove array duplicates.

CAVEAT: Requires bash 4+

Example Function:

remove_array_dups() {
    # Usage: remove_array_dups "array"
    declare -A tmp_array

    for i in "$@"; do
        [[ "$i" ]] && IFS=" " tmp_array["${i:- }"]=1
    done

    printf '%s\n' "${!tmp_array[@]}"
}

Example Usage:

$ remove_array_dups 1 1 2 2 3 3 3 3 3 4 4 4 4 4 5 5 5 5 5 5
1
2
3
4
5

$ arr=(red red green blue blue)
$ remove_array_dups "${arr[@]}"
red
green
blue

Cycle through an array.

Each time the printf is called, the next array element is printed. When it reaches the last array element, it starts again from the first.

arr=(a b c d)

cycle() {
    printf '%s ' "${arr[${i:=0}]}"
    ((i=i>=${#arr[@]}-1?0:++i))
}

Toggle between two values.

This works the same as above; it's just a different use case.

arr=(true false)

cycle() {
    printf '%s ' "${arr[${i:=0}]}"
    ((i=i>=${#arr[@]}-1?0:++i))
}

Loop over a range of numbers.

Don't use seq.

# Loop from 0-100 (no variable support).
for i in {0..100}; do
    printf '%s\n' "$i"
done

Loop over a variable range of numbers.

Don't use seq.

# Loop from 0-VAR.
VAR=50

for ((i=0; i<=VAR; i++)); do
    printf '%s\n' "$i"
done

Loop over an array.

arr=(apples oranges tomatoes)

# Just elements.
for element in "${arr[@]}"; do
    printf '%s\n' "$element"
done

Loop over an array with an index.

arr=(apples oranges tomatoes)

# Elements and index.
for i in "${!arr[@]}"; do
    printf '%s\n' "${arr[$i]}"
done

# Alternative method.
for ((i=0; i<${#arr[@]}; i++)); do
    printf '%s\n' "${arr[$i]}"
done

Loop over the contents of a file.

while read -r line; do
    printf '%s\n' "$line"
done < "file"

Loop over files and directories.

Don’t use ls.

# Greedy example.
for file in *; do
    printf '%s\n' "$file"
done

# PNG files in dir.
for file in ~/Pictures/*.png; do
    printf '%s\n' "$file"
done

# Iterate over directories.
for dir in ~/Downloads/*/; do
    printf '%s\n' "$dir"
done

# Iterate recursively.
shopt -s globstar
for file in ~/Pictures/**/*; do
    printf '%s\n' "$file"
done
shopt -u globstar

Read a file to a string.

Alternative to the cat command.
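The code block for this entry was lost in transcription; a minimal sketch using bash's `$(<file)` command substitution, with an illustrative file path:

```shell
# Create a sample file for the demonstration (illustrative path).
printf 'line1\nline2\n' > /tmp/sample_file

# $(<file) reads the whole file into a string without spawning cat;
# trailing newlines are stripped, as with any command substitution.
file_data="$(</tmp/sample_file)"

printf '%s\n' "$file_data"
```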

Read a file to an array (by line).

Alternative to the cat command.

# Bash <4
IFS=$'\n' read -d "" -ra file_data < "file"

# Bash 4+
mapfile -t file_data < "file"

Get the first N lines of a file.

Alternative to the head command.

CAVEAT: Requires bash 4+

Example Function:

head() {
    # Usage: head "n" "file"
    mapfile -tn "$1" line < "$2"
    printf '%s\n' "${line[@]}"
}

Example Usage:

$ head 2 ~/.bashrc
# Prompt
PS1=''

$ head 1 ~/.bashrc
# Prompt

Get the last N lines of a file.

Alternative to the tail command.

CAVEAT: Requires bash 4+

Example Function:

tail() {
    # Usage: tail "n" "file"
    mapfile -tn 0 line < "$2"
    printf '%s\n' "${line[@]: -$1}"
}

Example Usage:

$ tail 2 ~/.bashrc
# Enable tmux.
# [[ -z "$TMUX" ]] && exec tmux

$ tail 1 ~/.bashrc
# [[ -z "$TMUX" ]] && exec tmux

Get the number of lines in a file.

Alternative to wc -l.

CAVEAT: Requires bash 4+

Example Function:

lines() {
    # Usage: lines "file"
    mapfile -tn 0 lines < "$1"
    printf '%s\n' "${#lines[@]}"
}

Example Usage:

Count files or directories in directory.

This works by passing the output of the glob as function arguments. We then count the arguments and print the number.

Example Function:

count() {
    # Usage: count /path/to/dir/*
    #        count /path/to/dir/*/
    printf '%s\n' "$#"
}

Example Usage:

# Count all files in dir.
$ count ~/Downloads/*
232

# Count all dirs in dir.
$ count ~/Downloads/*/
45

# Count all jpg files in dir.
$ count ~/Pictures/*.jpg
64

Create an empty file.

Alternative to touch.

# Shortest.
:> file

# Longer alternatives:
echo -n > file
printf '' > file

Get the directory name of a file path.

Alternative to the dirname command.

Example Function:

dirname() {
    # Usage: dirname "path"
    printf '%s\n' "${1%/*}/"
}

Example Usage:

$ dirname ~/Pictures/Wallpapers/1.jpg
/home/black/Pictures/Wallpapers/

$ dirname ~/Pictures/Downloads/
/home/black/Pictures/

Get the base-name of a file path.

Alternative to the basename command.

Example Function:

basename() {
    # Usage: basename "path"
    : "${1%/}"
    printf '%s\n' "${_##*/}"
}

Example Usage:

$ basename ~/Pictures/Wallpapers/1.jpg
1.jpg

$ basename ~/Pictures/Downloads/
Downloads

Simpler syntax to set variables.

# Simple math
((var=1+2))

# Decrement/Increment variable
((var++))
((var--))
((var+=1))
((var-=1))

# Using variables
((var=var2*arr[2]))

Ternary tests.

# Set the value of var to var2 if var2 is greater than var.
# var:      variable to set.
# var2>var: Condition to test.
# ?var2:    If the test succeeds.
# :var:     If the test fails.
((var=var2>var?var2:var))

Convert a hex color to RGB.

Example Function:

hex_to_rgb() {
    # Usage: hex_to_rgb "#FFFFFF"
    ((r=16#${1:1:2}))
    ((g=16#${1:3:2}))
    ((b=16#${1:5:6}))
    printf '%s\n' "$r $g $b"
}

Example Usage:

$ hex_to_rgb "#FFFFFF"
255 255 255

Convert an RGB color to hex.

Example Function:

rgb_to_hex() {
    # Usage: rgb_to_hex "r" "g" "b"
    printf '#%02x%02x%02x\n' "$1" "$2" "$3"
}

Example Usage:

$ rgb_to_hex "255" "255" "255"
#FFFFFF

Get the terminal size in lines and columns (from a script).

This is handy when writing scripts in pure bash and stty/tput can’t be called.

Example Function:

get_term_size() {
    # Usage: get_term_size

    # (:;:) is a micro sleep to ensure the variables are
    # exported immediately.
    shopt -s checkwinsize; (:;:)
    printf '%s\n' "$LINES $COLUMNS"
}

Example Usage:

# Output: LINES COLUMNS
$ get_term_size
15 55

Get the terminal size in pixels.

CAVEAT: This does not work in some terminal emulators.

Example Function:

get_window_size() {
    # Usage: get_window_size
    printf '%b' "${TMUX:+\\ePtmux;\\e}\\e[14t${TMUX:+\\e\\\\}"
    IFS=';t' read -d t -t 0.05 -sra term_size
    printf '%s\n' "${term_size[1]}x${term_size[2]}"
}

Example Usage:

# Output: WIDTHxHEIGHT
$ get_window_size
1200x800

# Output (fail):
$ get_window_size
x

Get the current cursor position.

This is useful when creating a TUI in pure bash.

Example Function:

get_cursor_pos() {
    # Usage: get_cursor_pos
    IFS='[;' read -p $'\e[6n' -d R -rs _ y x _
    printf '%s\n' "$x $y"
}

Example Usage:

# Output: X Y
$ get_cursor_pos
1 8

Shorter for loop syntax.

# Tiny C Style.
for ((;i++<10;)) { echo "$i"; }

# Undocumented method.
for i in {1..10}; { echo "$i"; }

# Expansion.
for i in {1..10}; do echo "$i"; done

# C Style.
for ((i=0; i<=10; i++)); do echo "$i"; done

Shorter infinite loops.

# Normal method
while :; do echo hi; done

# Shorter
for ((;;)) { echo hi; }

Shorter function declaration.

# Normal method
f(){ echo hi;}

# Using a subshell
f()(echo hi)

# Using arithmetic
# You can use this to assign integer values.
# Example: f a=1
#          f a++
f()(($1))

# Using tests, loops etc.
# NOTE: You can also use 'while', 'until', 'case', '(())', '[[]]'.
f()if true; then echo "$1"; fi
f()for i in "$@"; do echo "$i"; done

Shorter if syntax.

# One line
[[ "$var" == hello ]] && echo hi || echo bye
[[ "$var" == hello ]] && { echo hi; echo there; } || echo bye

# Multi line (no else, single statement)
[[ "$var" == hello ]] && \
    echo hi

# Multi line (no else)
[[ "$var" == hello ]] && {
    echo hi
    # ...
}

Simpler case statement to set variable.

We can use the : builtin to avoid repeating variable= in a case statement. The $_ variable stores the last argument of the last successful command. : always succeeds so we can abuse it to store the variable value.

# Example snippet from Neofetch.
case "$(uname)" in
    "Linux" | "GNU"*)
        : "Linux"
    ;;

    *"BSD" | "DragonFly" | "Bitrig")
        : "BSD"
    ;;

    "CYGWIN"* | "MSYS"* | "MINGW"*)
        : "Windows"
    ;;

    *)
        printf '%s\n' "Unknown OS detected, aborting..." >&2
        exit 1
    ;;
esac

# Finally, set the variable.
os="$_"

NOTE: This list does not include every internal variable (You can help by adding a missing entry!).

For a complete list, see: http://tldp.org/LDP/abs/html/internalvariables.html

Get the location to the bash binary.
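The code block for this entry was lost in transcription; bash exposes this in the `BASH` internal variable:

```shell
# "$BASH" expands to the full path used to invoke the running bash.
printf '%s\n' "$BASH"
```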

Get the version of the current running bash process.

# As a string.
"$BASH_VERSION"

# As an array.
"${BASH_VERSINFO[@]}"

Open the user's preferred text editor.

"$EDITOR" "$file"

# NOTE: This variable may be empty, set a fallback value.
"${EDITOR:-vi}" "$file"

Get the name of the current function.

# Current function.
"${FUNCNAME[0]}"

# Parent function.
"${FUNCNAME[1]}"

# So on and so forth.
"${FUNCNAME[2]}"
"${FUNCNAME[3]}"

# All functions including parents.
"${FUNCNAME[@]}"

Get the host-name of the system.

"$HOSTNAME"

# NOTE: This variable may be empty.
# Optionally set a fallback to the hostname command.
"${HOSTNAME:-$(hostname)}"

Get the architecture of the Operating System.
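The code block for this entry was lost in transcription; bash stores this in the `HOSTTYPE` internal variable:

```shell
# "$HOSTTYPE" describes the machine bash is running on (e.g. x86_64).
printf '%s\n' "$HOSTTYPE"
```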

Get the name of the Operating System / Kernel.

This can be used to add conditional support for different Operating Systems without needing to call uname.
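The code block for this entry was lost in transcription; a sketch using the `OSTYPE` internal variable:

```shell
# "$OSTYPE" names the OS bash was built for (e.g. linux-gnu, darwin).
case "$OSTYPE" in
    linux*)  printf '%s\n' "Running on Linux." ;;
    darwin*) printf '%s\n' "Running on macOS." ;;
    *)       printf '%s\n' "Running on: $OSTYPE" ;;
esac
```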

Get the current working directory.

This is an alternative to the pwd built-in.
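The code block for this entry was lost in transcription; the `PWD` internal variable holds the same value:

```shell
# "$PWD" holds the current working directory, like the pwd built-in.
printf '%s\n' "$PWD"
```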

Get the number of seconds the script has been running.
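The code block for this entry was lost in transcription; bash tracks this in the `SECONDS` internal variable:

```shell
# "$SECONDS" counts whole seconds since the shell started;
# assigning to it resets the counter.
start="$SECONDS"

# ... work happens here ...

printf 'Elapsed: %s seconds\n' "$((SECONDS - start))"
```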

Use read as an alternative to the sleep command.

I was surprised to find out sleep is an external command and isn't a built-in.

Example Function:

read_sleep() {
    # Usage: read_sleep 1
    #        read_sleep 0.2
    read -rst "${1:-1}" -N 999
}

Example Usage:

read_sleep 1
read_sleep 0.1
read_sleep 30

Check if a program is in the user's PATH.

# There are 3 ways to do this and you can use either of
# these in the same way.
type -p executable_name &>/dev/null
hash executable_name &>/dev/null
command -v executable_name &>/dev/null

# As a test.
if type -p executable_name &>/dev/null; then
    # Program is in PATH.
fi

# Inverse.
if ! type -p executable_name &>/dev/null; then
    # Program is not in PATH.
fi

# Example (Exit early if program isn't installed).
if ! type -p convert &>/dev/null; then
    printf '%s\n' "error: convert isn't installed, exiting..."
    exit 1
fi

Get the current date using strftime.

Bash’s printf has a built-in method of getting the date which we can use in place of the date command in a lot of cases.

CAVEAT: Requires bash 4+

Example Function:

date() {
    # Usage: date "format"
    # See: 'man strftime' for format.
    printf "%($1)T\\n" "-1"
}

Example Usage:

# Using above function.
$ date "%a %d %b  - %l:%M %p"
Fri 15 Jun  - 10:00 AM

# Using printf directly.
$ printf '%(%a %d %b  - %l:%M %p)T\n' "-1"
Fri 15 Jun  - 10:00 AM

# Assigning a variable using printf.
$ printf -v date '%(%a %d %b  - %l:%M %p)T\n' '-1'
$ printf '%s\n' "$date"
Fri 15 Jun  - 10:00 AM

Progress bars.

This is a simple way of drawing progress bars without needing a for loop in the function itself.

Example Function:

bar() {
    # Usage: bar 1 10
    #            ^----- Elapsed Percentage (0-100).
    #               ^-- Total length in chars.
    ((elapsed=$1*$2/100))

    # Create the bar with spaces.
    printf -v prog  "%${elapsed}s"
    printf -v total "%$(($2-elapsed))s"

    printf '%s\r' "[${prog// /-}${total}]"
}

Example Usage:

for((i=0;i<=100;i++));do# Pure bash micro sleeps (for the example).
    (:;:) && (:;:) && (:;:) && (:;:) && (:;:)# Print the bar.
    bar "$i""10"doneprintf'\n'

Bypass shell aliases.

# alias
ls

# command
# shellcheck disable=SC1001
\ls

Bypass shell functions.

# function
ls

# command
command ls

Bandai: A Japanese toymaker’s role in the history of board games (2017)


A Japanese toymaker’s role in the history of board games

If you played NES in the 80s, watched Power Rangers in the 90s, or dipped into the world of Japanese toys in the past five decades, you’ve likely heard of Bandai. Established in 1950, the toymaker managed to insinuate their name into the American consciousness like many other Japanese companies of the time—via toys, television, and videogames. Though much younger than the House of Mario, Bandai’s own rise to toy stardom reads like an alt-history version of Nintendo’s. They came of age with model cars and toys at the same time Nintendo shifted focus to the toy and (non-electronic) game business. Like Nintendo, Bandai attempted to capitalize on the emerging videogame market in the 1970s with their TV Jack ball-and-paddle units. They produced pocket electronic games in the wake of Nintendo’s Game & Watch series. They even tried their hand at licensed versions of American consoles, including the Arcadia 2001, Intellivision, and Vectrex—all of which would fail to make an impact in Japan, in part due to the success of the Family Computer—courtesy of Nintendo, of course.

As Nintendo sought worldwide dominance of the videogame market in the 1980s, Bandai played to their strengths as toymakers, bolstering their brand with popular TV, anime, and manga licenses. Ultraman, Dragon Ball Z, Fist of the North Star, Macross, Gundam, Sailor Moon, and Kamen Rider were all among their stable of toys and models. But Bandai’s rise to toy market supremacy began in 1973, when they became sponsor and exclusive merchandiser of Toei Studio’s Himitsu Sentai Gorenger. Twenty years later, the series would cross over to America as the Mighty Morphin Power Rangers, one of the most successful children’s television shows of all time.

Despite Bandai’s decades of international success, a major part of their history is almost completely unknown outside Japan. Between 1980 and 1990, Bandai released nearly 250 board games under the ジョイファミリー Joy Family, パーティジョイ Party Joy, and affiliated spin-off series. And like their toys and models, these games would feature an impressive breadth of original and licensed characters, from horror stalwarts like Dracula to an up-and-coming pair of action-plumbing brothers named Mario and Luigi. In the U.S., however, Bandai’s board game oeuvre is known only by a few collectors and a handful of entries on Board Game Geek. These omissions overlook not only a landmark in Japan’s board game history, but an important bellwether for changing tastes in popular culture and media during the 1980s.

Old and New Sugoroku

If you trawl Yahoo! Auctions (Japan's equivalent to eBay) for vintage board games, you'll often run across the term sugoroku (双六) in the item descriptions. Superficially, sugoroku describes a genre of board game that community database sites like Board Game Geek classify as roll/spin and move. Among game designers, the x-and-move genre is relegated to kids' game status (e.g., Candyland), because the reliance on luck and random die rolls divests players of any strategic options. In many games of this style, she who rolls best wins. Historically, there is some truth to the claim—in the licensing mania that dominated board games in the 60s and 70s, x-and-move games were the design of least resistance for companies that wanted to capitalize on fleeting media trends.

But reducing sugoroku to its mechanical core misses centuries of cultural context. In Japan, the genre traces back to two key stylistic/material variations: 盤双六 ban-sugoroku (or board sugoroku) and 絵双六 e-sugoroku (or picture sugoroku). The former is a Chinese import that dates to at least the 7th century in Japan. An abstract game, ban-sugoroku’s columnar spatial divisions and black & white tokens have clear ties to backgammon, though there are slight rule variations that distinguish it from its ancestor. Like many dice-based games of its vintage, ban-sugoroku waxed and waned in public favor (and legality) due to its associations with gambling.


Playing sugoroku in 17th-century Japan

E-sugoroku, as the name implies, ditched the abstract spaces of ban-sugoroku in favor of vibrant pictorial spaces. These ranged from grids of related vignettes that resemble manga to maps of tourist destinations in Japan. Movement through these spaces could be metaphorical, similar to the slides into moral turpitude in snakes ‘n’ ladders, or representational and literal, for instance, by moving one’s piece from point to point on a real map. Whether literal or metaphorical, in sugoroku, players raced to move their pawns from a starting point along a route to an end goal.

E-sugoroku’s reliance on images tied its growth to technological innovations in printing modes and media. Many sugoroku, for instance, were printed on paper in the ukiyo-e (浮世絵) style during Japan’s Edo period (c. 17th to mid-19th century). In the subsequent Meiji period (1868–1912), improvements in printing technology transitioned sugoroku from rarified to mass media. Magazines in particular became the primary vehicle for sugoroku’s distribution and consumption. Publications would include sugoroku supplements to celebrate holidays, instruct young people, sell products, and more.

A lovely detail from a Meiji 44 (1911) sugoroku (Source: sugoroku.net)

Sugoroku’s ties to mass media and printing technology have important implications for how the genre is understood today, as historian Anthony Bryant explains:

Because the game process of sugoroku was moving pieces around a board from one point to a determined exit point, the name “sugoroku” came to be applied to another type of game that should be very familiar to anyone who’s ever played Monopoly, Life, or even Chutes and Ladders. These games in particular were popular among the common townsfolk, as the “playing board” was a printed piece of paper which was cheap, and could be folded up and carried about. The playing pieces could be anything from a distinctive pebble to a coin (the Period equivalents to a little tin top hat, if you will).

One of the most common forms these game boards took was “The Fifty-three Stages of the Tôkaidô,” where each of the possible fifty-one spots between “start” and “finish” depicts a scene of one of the post stations on the great Edo-to-Kyôto trunk road. The start point is at Edo’s Nihon-bashi (station one) and the goal is Kyôto (station fifty-three).

Other game boards include “tours” of famous actors, famous sites and events of the Genpei War, etc. Several post-Period versions are based on events surrounding the tale of the 47 Rônin, and others feature characters in popular contemporary literary works.

Bryant’s quote illustrates how sugoroku is not simply a set of genre or mechanical conventions. Over centuries, the label has come to signify an aggregate of material, procedural, and thematic properties. You can think of sugoroku as “mobile gaming for the people,” i.e., portable, low-cost, quick and easy to play, and tied to popular themes and figures in history, literature, and other media.

Licensed Play

The landscape of Japanese board games in the 1950s, 60s, and 70s looked similar to that of the United States. The advent and unprecedented adoption of television shifted the tenor of games from teaching tools, novel diversions, and sports simulations to cross-media tie-ins that aimed to leverage the fleeting popularity of TV shows and stars. As Polizzi and Schaefer wrote in Spin Again:

Between 1948 and 1958, Americans spent over $15 billion on television sets. As television infiltrated the national psyche, game companies worried that this new “toy” would edge out family game-playing. Little did they realize how revolutionary and positive it would be for the toy industry: inspiring board games and offering an entirely new forum for reaching their true customers (kids!) directly by advertising during their favorite programs. (13)

During television’s pre-regulation era, advertisers could target children directly. In Millenial Monsters, Anne Allison notes how Bandai aggressively capitalized on TV’s popularity in Japan:

Instilling this appetite in children and (re)producing them as avid and future consumers was driven directly by toy companies like Bandai that, by taking over the sponsorship of children’s television shows starting in the 1970s, crafted “money shots” on the screen that would translate, directly and repeatedly, to the desire to buy (their brand of) toy merchandise.

Bandai’s board games naturally followed this strategy, leveraging many of the licenses the company had used for their models and toys.

Among Bandai and its publishing competitors, there was a concurrent shift in the target demographics for board games. What was once the exclusive province of families or children was broadened to include new, and specifically older, players. Bandai’s awkwardly named Game for Adult “if” series (i.e., “What if you played as X in this historical battle…?”) and competitor Epoch’s SLG (i.e., Simulation Games) series shared lineage with both the rules-heavy wargames of American hobbyists and 3M’s groundbreaking, adult-marketed Bookshelf Games. By the end of the 70s, game companies were realizing that adults, university students, and other mature audiences might enjoy games among their peers (or independent of their children).

In short, while Japanese board game publishers were mirroring trends in Western games, they offered a unique cultural spin on licensing and marketing. Where the U.S. veered hard toward D&D-inspired fantasy roleplaying in the 70s, Japan leaned more toward homegrown sci-fi and folklore like Macross or GeGeGe no Kitarō (ゲゲゲの鬼太郎). Japanese board games adopted familiar models but filled them with different content.

The Affordances of ¥1,000

These prevailing trends set the context for Bandai’s Joy Family series, which debuted its first games in 1980 (see note below). The series’ first games covered a range of pop cultural subjects, including horror (ドラキュラゲーム aka Dracula Game), travel (JALPAK 世界一周ゲーム aka JALPAK Around the World Game), manga (Dr.スランプ アラレちゃん ゲーム aka Dr. Slump: Arale-chan Game), and even one of the first videogame licenses (クレイジークライマー aka Crazy Climber). Though ostensibly oversized sugoroku, Bandai’s Joy Family games featured numerous and varied components, colorful art, strong visual design, and novel uses of diorama-like 3D features (many that harken to U.S. board game manufacturer Ideal’s toy/game hybrids of the 1960s). And by all accounts, the Joy Family series was an early success. One of its first games, 1980’s Haunted House おばけ屋敷ゲーム, for instance, features heavily in Japanese popular press histories of board games and had a reissue in the early 2010s.


A Note on Dates and Titles

There are nearly no English sources for Bandai’s Joy Family releases. I’ve derived dates and titles from both primary sources (i.e., buying the games and checking their copyright dates or finding images online) and secondary sources (e.g., Japanese books, blog posts, interviews, etc.). Since my Japanese is elementary, many translations are machine-assisted or rely upon English text printed on the games themselves. When dates are indecipherable or unlisted, I rely on available context clues. Bandai’s logo, for instance, had four distinct designs between 1980 to 1990, so it can provide approximate date ranges. In sum, this is difficult and time-consuming work. The language barrier is steep, and, perhaps worse, Yahoo! Auctioneers have a predilection for blurry, off-axis photos that obscure important text or dates. So if you see any glaring errors or you can help with clarifications, please let me know.


Joy Family games were noticeably larger than their U.S. contemporaries. The standard Milton Bradley box in the 1980s, for instance, measured 19 x 9.5 x 2 in. (48.3 x 24.1 x 5 cm), while the Joy Family version of Super Mario Bros. 2 (スーパーマリオブラザーズゲーム2) from 1986 measured 20 x 12 x 2 in. (50.8 x 30.5 x 5 cm). Similarly, unlike standard single-fold U.S. boards made from chipboard backed with buckram, Joy Family boards were made with a lighter card stock, printed on both sides (with a manga or other illustrated text on the verso), and, in lieu of a flat center fold, folded into a J- or U-shape that wrapped around the game’s interior components. There was often a colorful sliding drawer box (for storing pawns, dice, etc.) that fit the box height- and width-wise. The lighter board construction and standardized storage/component construction must have eased production costs, a trait inherited from the mass media sugoroku found in magazines decades prior.


Family Joy Super Mario Bros. 2 (1986)

Joy Family games had strong branding from their inception. Early boxes featured nearly full-size artwork on the cover with a white band along the right edge. At the top of this band was the Joy Family logo: three fanned, red playing cards superimposed by two lines of katakana text, ボードゲーム (board game) and ジョイファミリー (Joy Family). (Why fanned cards? Interestingly, four of the earliest Joy Family games I’ve found read カードゲーム, or “card game”, rather than “board game” in the logo text, despite being games with boards. Based on the limited contextual evidence I have, my only guess is that these games appear to have used cards, rather than spinners or dice, to dictate movement around the board. Perhaps Bandai considered this to be a novel mechanic for sugoroku.) The logo was typically “Bandai red”, though it could change according to the game’s design.

Depending on their complexity and construction, Joy Family games cost roughly ¥2,000–¥3,500. If this historical currency calculator can be trusted, that range in 1980 equates to about $15–25 in 2015 dollars (though the yen appears to have been much weaker vs. the dollar at that time). For Joy Family’s intended audience of “8 to adult”, that was a pretty reasonable amount. If you were shopping at Target today, you’d spend a little less for Monopoly and a bit more for Settlers of Catan.

However, Bandai’s true price breakthrough happened three years after Joy Family’s debut. In 1983, the Party Joy series number 1, 悪霊島ゲーム Demon Island inaugurated a breakneck, eight-year stretch of continual board game releases. Priced at a uniform ¥1,000 (~$7), the Party Joy series squarely targeted the allowances of its primary school demographic.

Party Joy games had a prominent logo mark that echoed the red, tripartite brand of its Joy Family sibling. But instead of cards, the logo resembled a folded map, referencing the play board tucked within its diminutive sliding box. Beneath the katakana パーティジョイ Party Joy, Bandai included a small yellow circle with the series number, a sly marketing tactic that certainly compelled more obsessive compulsive children (and certain older game scholars) to collect as many as they could.


Party Joy #1 悪霊島ゲーム Demon Island contained a foldout board, instructions, two types of cards, a die, and four plastic player pawns

But the Party Joy games’ most striking material feature was their size. Each game measured 8.5 x 6.2 x 1 in. (22 x 15.5 x 2.8 cm), only slightly smaller than B5 paper size. For a Japanese child in the 80s, this would have traveled conveniently alongside both the pervasive, B5-sized Japonica learning books (ジャポニカ学習帳) or the latest manga digest. Like the sugoroku of decades past, Party Joy were designed for simplicity and travel. And as commodities, they were designed for collection, display, and licensing appeal.


A row of Party Joy spines arranged sequentially on a bookshelf

Unlike Joy Family’s traditional lidded box design, Party Joy games had an inner tray that slid out from the outer box’s right edge. The tray was initially printed cardboard that duplicated the box artwork in monotone and had a small adhesive pull tab to help grasp the inner box. In 1985, Bandai replaced the cardboard with molded plastic trays in pastel hues of green, blue, yellow, and (for at least one game) pink. The trays included a short folding flap on the right side that clicked into place when closed. At the top of this flap was an embossed Party Joy logo, and at its center, replacing the green pull tab, was a ridged thumb grip. The tray’s bottom had six recessed posts, four at the corners and two at the center of each longer side, as well as a small center hole. This clever design feature allowed the tray to serve as a platform for game accessories. The same plastic standees that held players’ cardboard pawns would fit into the recessed holes in order to hold, for example, cardboard walls for a 3D diorama. Similarly, the center hole could be used to fix a through-hole spinner mechanism, converting an unused storage tray into an active game component.


Party Joy #49 ケーキがいっぱーい uses the sliding tray as a platform to support its three-dimensional components

The Party Joy box became the key affordance for Bandai’s game designers. All components necessary for play, from the rule book to the game board, had to fit within the small sliding compartment. As the logo indicated, double-sided game boards were printed on thick card stock that could be folded or cut into, say, fourths or sixths, and stored within the tray. To further save space and manufacturing costs, playing cards were printed on perforated sheets that players would separate during setup. These printed sheets allowed Bandai to deviate from standard playing card size, so a single game could have myriad card types and shapes—standard, square, miniature, and more. Other standardized elements would appear across multiple games (and even non-Bandai games), including four plastic standees in primary colors (yellow, blue, red, green) and a miniature six-sided die with a large red circle for the one face, presumably evoking the Japanese flag.


Party Joy #37 モンスター学園 contains foldable cardboard elements that assemble into a three-dimensional board

Most Party Joy games used sugoroku-style mechanics. With few exceptions, either dice rolls or spinners drove player movement, while cards and other accessories factored in additional chance or strategy mechanics. Owing to the series’ target audience, Bandai’s design focus was on simple setup and play, eye-catching artwork, and novel, toy-like constructions. In survey postcards included in many Party Joy games, Bandai asked, “What kind of game would you like to see in the future?” The answer choices included “more three-dimensional,” “quick play,” “more thrilling game subjects,” “fun board designs,” and “nice components.”


A detail of the Party Joy #42 ぼくら少年探偵団ゲーム 犯人はだあ~れだ? spinner assembly that attaches to the bottom of its plastic tray

While Bandai’s idea of “thrilling subjects” predictably included licensed properties, the range of genres the series covered in 135 games is impressive, spanning horror, sports, travel, adventure, exploration, cooking, fantasy, humor, racing, and more. At the series’ peak, Bandai was releasing a game or more per month, so their stable of designers were expected to churn out game prototypes at a remarkable pace. In a 2015 interview, game designer Ikuo Nomura explains that he was among several outsourced contractors that Bandai hired to create their Joy-affiliated games. Due to limitations on copyright terms, game production had a short life span. No matter how well a particular game sold, after six months, it was taken out of production. As a result, Nomura explains, he had to focus on game designs that, inspired by sugoroku, did not require extensive rules to play. But conversely, speedy turnarounds allowed the designers to experiment with mechanics and styles, because failures and successes alike would be short-lived.

Nomura describes Party Joy’s material limitations as a kind of “platform”. Since the game’s price was fixed at ¥1,000, he explains, this set a hard limit on how much paper he could use or how many components he could include. Nomura tended to use spinners rather than dice, for example, because they were cheaper, and he could weight the random results easier. And no matter how clever a game component might be, it had to fit within the plastic tray. Ultimately, these same constraints would lead to the series’ demise—at ¥1,000, it was difficult to design board games with enough strategic interest to encourage long-term play. Such limitations would be a significant handicap as videogames began to attract young peoples’ time and money.

Joy Family vs. Family Computer

Bandai’s Party Joy and Nintendo’s Family Computer both arrived in 1983, and the impact of the latter would eventually spell defeat for the former. Videogames certainly didn’t catch Bandai by surprise. They’d been trying to make the new medium a viable business since its inception, dipping into consoles, electronic handhelds, and third-party development. In fact, the Joy Family series’ Crazy Climber, released in 1981, was among the first, if not the first, licensed board game adaptation of a popular arcade game. But it wasn’t until the Famicom gained momentum that Bandai’s videogame licensing kicked into high gear.

Nintendo’s Family Computer initially didn’t set the homeland on fire, but by 1985, buoyed by the breakout phenomenon of Super Mario Bros., the landscape of Japanese videogames was irrevocably changed. Japan already had a vibrant arcade culture, but Famicom brought the arcade home for millions of eager players. Bandai showed no hesitation licensing this new trend. Party Joy #51, スーパーマリオブラザーズ Super Mario Bros., was released the same year as the Famicom cartridge and featured the same cover artwork. Propped side-by-side, it’s remarkable how the Party Joy game looks like a deluxe version of the Famicom game, and Bandai certainly didn’t miss this comparison either. In the next year, Party Joy releases were almost exclusively dedicated to videogame licenses, including ツインビーゲーム TwinBee (#56), ゲゲゲの鬼太郎 妖怪大魔境 aka Ninja Kid (#58), グラディウス Gradius (#60), ゼルダの伝説ゲーム aka The Legend of Zelda (#61), マイティボンジャックゲーム Mighty Bomb Jack (#62), 謎の村雨城ゲーム aka The Mysterious Castle Murasame (#63), and Commando (#64). On the Joy Family side, there were adaptations of Super Mario Bros. (in a DX, or deluxe, version), Super Mario Bros. 2, and 魔界村 aka Ghosts ‘n Goblins. Whether these adaptations proved more or less popular than previous board games is unknown, but after Bandai’s initial licensing barrage, their videogame licenses precipitously decreased.


Party Joy Super Mario Bros
Party Joy #51 スーパーマリオブラザーズ Super Mario Bros.

Bandai’s competitors tried to hitch their wagon to videogames as well. Takahashi released a six-part Family Computer Board Game series clad in silver boxes that were nearly indistinguishable from Famicom boxes apart from their size, which was identical to Party Joy boxes. Namco decided to release board game versions of their own properties in two short-lived runs. Their Fantasy Board Game series released conversions of Tower of Druaga, Dragon Buster, and Pac-Land in 1986, followed by the Namcot Handy Board Game series, comprising only two games: Valkyrie’s Adventure and Super Xevious. (In English, “Handy Board” reads like an awkward linguistic construction for “portability,” but in Japanese, the name was strategic. Note that the characters for ハンディ “handy,” at a glance, are strikingly similar to the パーティ “party” in Party Joy, plus and minus a few diacritics.) A few years later, Enix would adapt their Dragon Quest franchise to multiple board and card games (which continue today), but beyond a few one-offs, no other company besides Bandai would attempt such a sustained videogame licensing schedule.


Mario Bros Family Computer board game
Games in the Family Computer Board Game series, like Mario Bros., are identical in size to Bandai's Party Joy games

One recent account of Japanese board game history (日本懐かしボードゲーム大全) claims that the rise of videogame adaptations demonstrates how the Famicom bolstered rather than damaged the 80s board game market. While key licenses may have driven some videogame players to board game adaptations, the market crossover must have had a significant impact on Bandai’s bottom line. Time spent playing videogames was time not spent playing board games. And despite the price differential between a Party Joy box and a Famicom cassette, millions of consumers clearly didn’t mind paying 3x or 4x more for the latest electronic diversion. Bandai obviously didn’t abandon the board game business post-1986, but Joy Family’s and Party Joy’s balance of original versus licensed games clearly swung toward the latter, signaling the need to tie their series to established media properties. But even Gundam couldn’t keep Joy afloat—by 1990, all of Bandai’s board game series were sputtering to a finish. Releases became more staggered (there were only three Party Joy games in ’90, including Super Mario World), licenses were less diverse (nearly every property was owned by Toei Animation), and sequels were prevalent. Consumer entertainment dollars were clearly shifting to videogames.

One of Bandai’s most interesting last-ditch efforts to revive the line was the spin-off パーティジョイ指南役 Party Joy Instructor series, which began in 1991. The games were all licensed Super Famicom adaptations (including Nintendo’s ゼルダの伝説・神々のトライフォース aka The Legend of Zelda: A Link to the Past and Capcom’s Rockman 4) sheathed in silver boxes featuring a prominent line drawing of a SFC game cartridge. The “Instructor” gimmick was a hybrid board game/strategy guide. Original artwork was interspersed with in-game screen captures, and portions of the instruction manual and game board were dedicated to strategies for the titular videogame. Ostensibly, while you were playing the board game, you were learning skills that would transfer to the console. Whether the Instructor series served its intended purpose is unclear, but judging by the series’ quick demise after seven games (and their relative scarcity today), videogame players weren’t biting.


Party Joy Instructor Zelda
Party Joy Instructor No. 6 ゼルダの伝説・神々のトライフォース aka The Legend of Zelda: A Link to the Past

Party Joy Instructor Zelda map
Party Joy Instructor game artwork incorporated game hints, strategies, and screenshots, as seen on the Legend of Zelda board

Bandai’s decade-long run of board game releases is remarkable not only for its sheer quantity, but for its breadth of genres, its shrewd attention to construction and design, and its willingness to experiment within prescribed constraints. It’s easy to dismiss these games as children’s fare or simplistic spin-and-move games, but there’s a rich vein of Japanese culture hiding beneath these mere sugoroku. Just as picture sugoroku superseded board sugoroku as players’ tastes changed, a significant sea change in entertainment consumption re-shaped board games’ design parameters in 1980s Japan. While prior media like television, manga, and anime had provided lucrative fodder for board game design in prior decades, videogames proved to be a competitor hewn too close to board games’ procedural core. Bandai tried both cooption and conciliation as strategies to compete with the new medium, but concession proved to be the only viable course. And while they ultimately didn’t survive videogames’ market incursion, Bandai’s board game adaptations are a significant, yet still largely overlooked, branch of (video)game history.

Croma: Elixir macro utilities to make type-based programming easier

Elixir macro utilities to make type-based programming easier.

Usage

  • Add :croma as a mix dependency.
  • Run $ mix deps.get.
  • Add use Croma in your source file to import/require macros defined in croma.
  • Hack!

Croma.Result

  • Croma.Result.t(a) is defined as @type t(a) :: {:ok, a} | {:error, any}, representing a result of computation that can fail.

  • This data type is prevalent in the Erlang and Elixir world. Croma makes it easier to work with Croma.Result.t(a) by providing utilities such as get/2, get!/1, map/2, map_error/2, bind/2 and sequence/1.

  • You can also use Haskell-like do-notation to combine results of multiple computations via the m/1 macro. For example,

    Croma.Result.m do
      x <- {:ok, 1}
      y <- {:ok, 2}
      pure x + y
    end

    is converted to

    Croma.Result.bind({:ok, 1}, fn x ->
      Croma.Result.bind({:ok, 2}, fn y ->
        Croma.Result.pure(x + y)
      end)
    end)

    and is evaluated to {:ok, 3}. (The do-notation is implemented by Croma.Monad.)

Croma.Defun : Typespec-oriented function definition

  • Annotating functions with type specifications is good but sometimes it's a bit tedious since one has to repeat some tokens (names of function and arguments, etc.) in @spec and def.

  • The defun/2 macro provides a shorthand syntax for defining a function and its typespec at once.

    • Example 1

      use Croma

      defun f(a :: integer, b :: String.t) :: String.t do
        "#{a}#{b}"
      end

      is expanded to

      @spec f(integer, String.t) :: String.t
      def f(a, b) do
        "#{a}#{b}"
      end
    • Example 2 (multi-clause syntax)

      use Croma

      defun dumbmap(as :: [a], f :: (a -> b)) :: [b] when a: term, b: term do
        ([]     , _) -> []
        ([h | t], f) -> [f.(h) | dumbmap(t, f)]
      end

      is expanded to

      @spec dumbmap([a], (a -> b)) :: [b] when a: term, b: term
      def dumbmap(as, f)
      def dumbmap([], _) do
        []
      end
      def dumbmap([h | t], f) do
        [f.(h) | dumbmap(t, f)]
      end
  • In addition to the shorthand syntax explained above, defun is able to generate code for runtime type checking:

    • guard: some_arg :: g[integer]
    • validation with valid?/1 of a type module (see below): some_arg :: v[SomeType.t]
  • There are also defunp and defunpt macros for private functions.

Type modules

  • Sometimes you may want to have more fine-grained control of data types than is allowed by Elixir's typespec. For example, you may want to distinguish "arbitrary String.t" from "String.t that matches a specific regex". Croma introduces "type module"s in order to express fine-grained types and enforce type contracts at runtime, with minimal effort.

  • Leveraging Elixir's lightweight syntax for defining modules (i.e. you can easily make multiple modules within a single source file), croma encourages you to define lots of small modules to organize code, especially types, in your Elixir projects. Croma expects that a type is defined in its dedicated module, which we call a "type module". This way a type can have associated functions within its type module.

  • The following definitions in type modules are used by croma:

    • @type t
      • The type represented in Elixir's typespec.
    • valid?(any) :: boolean
      • Runtime check of whether a given value belongs to the type. Used by validation of arguments and return values in defun-family of macros.
    • new(any) :: {:ok, t} | {:error, any}
      • Tries to convert a given value to a value that belongs to this type. Useful e.g. when converting a JSON-parsed value into an Elixir value.
    • default() :: t
      • Default value of the type. Used as default values of struct fields.

    @type t and valid?/1 are mandatory as they are the raison d'etre of a type module, but the others can be omitted. And of course you can define any other functions in your type modules as you like.

  • You can always define your type modules by directly implementing above functions. For simple type modules croma prepares some helpers for you:

    • type modules of built-in types such as Croma.String, Croma.Integer, etc.
    • helper modules such as Croma.SubtypeOfString to define "subtype"s of existing types
    • Croma.Struct for structs
    • ad-hoc module generator macros defined in Croma.TypeGen

Croma.SubtypeOf*

  • You can define your type module for "String.t that matches ~r/foo|bar/" as follows (we use defun here but you can of course use @spec and def instead):

    defmodule MyString1 do
      @type t :: String.t

      defun valid?(t :: term) :: boolean do
        s when is_binary(s) -> s =~ ~r/foo|bar/
        _                   -> false
      end
    end
  • However, as this is a common pattern, croma provides a shortcut:

    defmodule MyString2 do
      use Croma.SubtypeOfString, pattern: ~r/foo|bar/
    end
  • There are also SubtypeOfInt, SubtypeOfFloat and so on.

Croma.Struct

  • Defining a type module for a struct can be tedious since you have to check all fields in the struct.

  • Using type modules for struct fields, Croma.Struct generates a type module definition for a struct.

    defmodule I do
      use Croma.SubtypeOfInt, min: 1, max: 5, default: 1
    end

    defmodule S do
      use Croma.Struct, fields: [
        i: I,
        f: Croma.Float,
      ]
    end

    S.valid?(%S{i: 5, f: 1.5})         # => true
    S.valid?(%S{i: "not_int", f: 1.5}) # => false

    {:ok, s} = S.new(%{f: 1.5})        # => {:ok, %S{i: 1, f: 1.5}}

    # `update/2` is also generated for convenience
    S.update(s, [i: 5])                # => {:ok, %S{i: 5, f: 1.5}}
    S.update(s, %{i: 6})               # => {:error, {:invalid_value, [S, I]}}

Croma.TypeGen

  • Suppose you have a type module I, and suppose you want to define a struct that has a field with type nil | I.t. As nilable fields are common, defining type modules for all nilable fields introduces too much boilerplate code.

  • Croma has a set of macros to define this kind of trivial type modules in-line. For example you can write as follows using nilable/1:

    defmodule S do
      use Croma.Struct, fields: [
        i: Croma.TypeGen.nilable(I),
      ]
    end

Notes on backward compatibility

  • In 0.7.0 we separated responsibility of validate/1 into valid?/1 and new/1.
    • Although older type module implementations that define validate/1 should work as before, please migrate to the newer interface by replacing validate/1 with valid?/1 and optionally new/1.
    • In 0.8.0 we removed support of validate/1.

On Intelligence in Cells: The Case for Whole Cell Biology (2009) [pdf]

Apple’s ‘Behind the Mac’ ads have a double meaning

Apple just released four new ads focused on the Mac. The ads are teeming with emotion, showing earnest people doing creative things behind their Mac computers. Unfortunately, the series is dubbed ‘Behind the Mac’ at a time when many worry that Apple has lost the plot, causing the Mac to fall behind the competition.

Each YouTube video links out to Apple’s Mac page, a page that’s headlined by the $5,000 iMac Pro. However, as noted by Quentin Carnicelli over at Rogue Amoeba, the iMac Pro is the only macOS computer to get an update in the last year. The computers featured in Apple’s new ads are all MacBooks.

Right now, Apple’s Mac computers are plagued by a series of concerns. Off the top of my head:

  • The MacBook Pro is not a computer made for professionals.
  • TouchBar, lol.
  • Mac Pro, ugh.
  • Why is Apple still selling a giant, under-specced, and over-priced Mac Mini that hasn’t been updated or seen a price drop in over four years?
  • When will Apple fix the questionable MacBook keyboards?
  • Why hasn’t Apple updated its Macs with the latest Intel CPUs yet?

This image from MacRumors’ excellent buyer’s guide sums up the situation nicely:

Caution!
Image: MacRumors Buyer’s Guide

I’m sure the message we’re supposed to take away from the new ad campaign is that Apple is committed to the Mac platform, despite evidence to the contrary. Great. But instead of new ads, wouldn’t it be better if Apple released some new Macs instead?

LearnSnake: Teaching AI to Play Snake Using Reinforcement Learning (Q-Learning)

This is an implementation of an Artificial Intelligence fully written in JavaScript that learns to play Snake using Reinforcement Learning.

Play with it…

  • Learning Rate: How aggressively the AI learns to play (close to 0 will be too slow; close to 1 will simply replace the old learned value with the new). Higher is not necessarily better

  • Discount Factor: Importance between immediate rewards and future rewards

  • Action Randomization: Percentage of time a random action will be executed instead of the desired action

You can check the GitHub Repository to see the source code of this project.

If you want to play some snake yourself, you should try clicking the canvas and entering the Konami Code! See what happens…


The first question I was asked when I came up with this idea: “Why do you want to use Javascript instead of Python?”. I know Python has some libraries for scientific computing (like NumPy and TensorFlow), but I wanted something that worked right out of the box: no installation, running directly in the browser. This way, anyone can test and play with it without worrying about setting up a proper environment.

Snake (the game itself)

Snake is a game in which a snake needs to explore an environment and catch the fruit without hitting any obstacle or itself. Every time the snake catches a fruit, its size increases (for practical reasons explained in the next section, the snake has a fixed size in the live example above).

I could have forked another Snake repository, but since I didn’t know Javascript (and I would need to use it on the next steps), I thought that developing the Snake game from scratch would be a good idea to learn more about it.

To develop this part of the project, I used these contents for guidance:

See game.js for the source code.

Q-Learning

According to Christopher Watkins, “Q-Learning is a simple way for agents to learn how to act optimally in controlled Markovian domains” (Watkins, 1989). In simple terms, it uses an MDP (Markov Decision Process) to control and make decisions in an environment. It consists of a Q-Table that is constantly updated.

A Q-Table is a matrix with a set of states and each action’s probability of success. As the agent explores the environment, the table is updated. The action with the highest value is considered the best action to take.

Example of a QTable (the best action is highlighted)

In this guide I will explain how I applied Q-Learning in the Snake game. If you want to understand more deeply (yet in a simple way) about Q-Learning and Reinforcement Learning, I suggest this Medium post by Vishal Maini.

See q-learning.js for the source code.

Algorithm

The algorithm consists of:

    s = game.state()             // get state s
    act = best-action( Q(s) )    // execute best action act given state s
    rew = game.reward()          // receive reward rew
    s’ = game.state()            // get next state s’
    Q(s, act) = update-qTable()  // update Q-Table value of state s and action act
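Put together, one cycle of that loop can be sketched in JavaScript as follows. This is only a sketch: the `game` object and all helper names here are hypothetical stand-ins, not the actual API of game.js or q-learning.js.

```javascript
// One training step, as a sketch. `game` is a hypothetical stand-in
// exposing state(), move(action) and reward().
const ACTIONS = ["UP", "DOWN", "LEFT", "RIGHT"];

// Greedy action: the entry with the highest Q-value for this state.
function bestAction(qTable, s) {
  const row = qTable[s] || {};
  return ACTIONS.reduce((a, b) => ((row[a] || 0) >= (row[b] || 0) ? a : b));
}

function step(game, qTable, alpha, gamma) {
  const s = game.state();            // get state s
  const act = bestAction(qTable, s); // best action act given state s
  game.move(act);                    // execute the action
  const rew = game.reward();         // receive reward rew
  const s2 = game.state();           // get next state s'
  const row = (qTable[s] = qTable[s] || {});
  const maxNext = Math.max(...ACTIONS.map(a => (qTable[s2] || {})[a] || 0));
  // Q-learning update with learning rate alpha and discount factor gamma
  row[act] = (row[act] || 0) + alpha * (rew + gamma * maxNext - (row[act] || 0));
}
```

Calling `step` in a loop (once per game tick) is all the "training" there is; the Q-table accumulates the learned values as a side effect.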

Update Q-Table

The new Q-Table value for the action taken is given by the following formula (taken from the article by Vishal Maini):

It’s executed after the action is taken and the reward is known.
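Written out, this is the standard Q-learning update, where α is the learning rate and γ the discount factor from the sliders above:

```latex
Q(s, act) \leftarrow Q(s, act) + \alpha \left[ rew + \gamma \max_{act'} Q(s', act') - Q(s, act) \right]
```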

Actions

Available actions are “UP”, “DOWN”, “LEFT” and “RIGHT”, simulating a user interaction with the game.

The best action is selected by choosing the biggest action value in a certain state in the Q-Table. If the maximum value equals 0, a random action is selected.
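That selection rule, combined with the "Action Randomization" slider, can be sketched as follows (the names are assumptions for illustration, not the project's actual identifiers):

```javascript
// Epsilon-greedy selection as described above: with probability `epsilon`
// take a random action; otherwise take the argmax over the state's Q-row,
// falling back to a random action when the best known value is still 0.
const ACTIONS = ["UP", "DOWN", "LEFT", "RIGHT"];
const randomAction = () => ACTIONS[Math.floor(Math.random() * ACTIONS.length)];

function selectAction(qRow, epsilon) {
  if (Math.random() < epsilon) return randomAction();
  const best = ACTIONS.reduce((a, b) => ((qRow[a] || 0) >= (qRow[b] || 0) ? a : b));
  return (qRow[best] || 0) === 0 ? randomAction() : best;
}
```

The zero-value fallback matters early in training: before any reward has propagated, every entry is 0, so a pure argmax would deterministically pick the same action forever.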

Reward

The only reward is given when the snake grabs the fruit ( +1 ). I also tried giving a small reward (approximately +0.1) when it successfully explored the environment without dying, but the result was that the snake only moved around the environment in circles without really caring about the fruit.

The penalty happens whenever the game resets ( -1 ), that is, the snake hits its tail or a wall.

If anything else happens, there’s no reward ( 0 ).

    Action            Reward
    Catch the fruit     +1
    Hits tail           -1
    Hits wall           -1
    Else                 0
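The table above maps directly onto a tiny reward function. This is a sketch; the event names are hypothetical, not the project's actual API:

```javascript
// Reward scheme from the table: +1 for the fruit, -1 for any reset, else 0.
function reward(event) {
  if (event === "fruit") return 1;                     // snake catches the fruit
  if (event === "tail" || event === "wall") return -1; // game resets
  return 0;                                            // anything else
}
```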

States

First I tried creating a dictionary of states based on all the snake positions and trail formats. Although it worked, the high number of states meant the code had to train for a very long time. Since this project was intended to give a fast explanation and show a fast result to the user (and the results are not saved across sessions), this way of saving states was not the best approach.

To work around this limitation, the tail was given a fixed size, and the dictionary of states is based only on the position of the fruit relative to the snake’s head and the position of the last tail section relative to the head.

In this way, the dictionary of states stores keys like this:
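A plausible shape for such a key, assuming simple {x, y} grid coordinates, is shown below. This illustrates the idea only; the exact format in q-learning.js may differ:

```javascript
// State key built only from the fruit's and the last tail section's
// positions relative to the snake's head (hypothetical format).
function stateKey(head, fruit, lastTail) {
  return [
    fruit.x - head.x, fruit.y - head.y,
    lastTail.x - head.x, lastTail.y - head.y,
  ].join(",");
}
```

Because only relative positions appear in the key, two board situations that look the same from the head's point of view share one Q-table row, which is what keeps the state space small.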

The Q-Table is stored using a Javascript Object and looks like this:

Example of a QTable in the code

If you want to test this algorithm with a full set of states, you can clone the project on GitHub and change a few lines to see how it behaves.


  • Q-Learning-Python-Example (an implementation of Q-Learning for the game “Catch the Ball”, which I used to understand the algorithm steps)

Ultibo – Environment for embedded development on Raspberry Pi

Welcome to the Ultibo wiki, the place to find detailed information about Ultibo, what it can do and how to use it. This content will be continually expanding as development of Ultibo core progresses, check back regularly to stay up to date with the latest information.

What is Ultibo?


Ultibo core is a full featured environment for embedded or bare metal (without an operating system) development on Raspberry Pi (all models). It is not an operating system itself but provides many of the same services as an OS such as memory management, threading, networking and file systems. Primarily intended to make development of embedded applications as similar as possible to development for other platforms by providing common features and services and eliminating the need to begin from scratch, Ultibo core can be used in many different ways.

You may want to use it simply to develop applications that take advantage of the extensive API to create connected devices for the Internet of Things (IoT), supporting common hardware such as USB, standard protocols like IP, TCP and UDP, or physical computing via GPIO control. Alternatively, you might use it to experiment with operating system design and embedded development, avoiding the need to create everything from scratch and allowing you to focus on trying out ideas and concepts in an environment where the basics are already taken care of.

More advanced users may opt to use Ultibo core as a base for exploring specific ARM technologies such as TrustZone and Hypervisor mode. Because you have complete and unrestricted hardware access, you can "take over" one or more CPUs to use as you require, for example, and still allow Ultibo core to provide basic services like network connectivity and logging.

While Ultibo core is not designed as a real time operating system (RTOS) it offers unrestricted access to hardware and allows the option for including real time components without the need to circumvent the OS functionality.

Whichever way you choose to use it, the possibilities are only limited by your imagination.

Getting Started


Based on the powerful open source Free Pascal compiler and the Lazarus integrated development environment (IDE), Ultibo core doesn't require you to gather together components to create a development environment or wrestle with linker script syntax. Everything you need to create and compile working applications is included in the installer download so you can create your first embedded application as easily as starting a new project, adding a few lines of code and compiling.

Getting Started - A simple guide to creating your first embedded application.
Demo Image - See what Ultibo can do by installing and running the demo image, or if you don’t have a Raspberry Pi you can watch a video of the demo on YouTube.

Architecture and Design


Designed as a unikernel or kernel in a run time library (RTL), Ultibo core is much more than just a boot loader or helper library. When you compile even the simplest application, the core functionality of threading, memory management, interrupt handling, clocks, timers, strings, objects, etc. is automatically included in your program, so before your code is executed a complete environment has been established to support your application. You don't need to know how it works to use Ultibo core, but being open source, everything is there for you to explore.

Architecture - Detailed information about the internal workings of Ultibo core.

Developing with Ultibo


With support for almost all of the Free Pascal RTL functionality including memory, strings, classes, objects, files, threads, sockets, exceptions and unicode, most of the information and examples in the Free Pascal documentation can be used directly with little or no change. Ultibo core also includes additional APIs that allow direct access to core functionality and support for specific hardware and protocols.

Unit Reference - Complete API reference for all Ultibo core units.
Environment Variables - Details of all environment variables that can be passed on the command line.

Even though the installer download provides everything needed to get started, some may want to build the Ultibo RTL, or even FPC and Lazarus from sources in order to customize the way things work. If you're a Linux or Mac user, we don't currently provide a package that includes our modifications to FPC or Lazarus however you should be able to build your own using our build instructions as a starting point.

Building from Source - How to rebuild the RTL, FPC or Lazarus from source on Windows.
Building for Debian - Building FPC and the RTL for Debian Linux.
Building for Raspbian - Building FPC and the RTL for Raspbian Linux.
Building for Mac - Building FPC and the RTL for Mac OSX.

Supporting the Ultibo Project


Like any open source project there are many ways you can help and support it. It could be as simple as telling others about it, sharing your projects, contributing some code or writing some documentation. You can even choose to directly sponsor a feature if you prefer. Whatever you might decide to do, the goal is always to create an environment where everyone can experience the excitement of creating something from their own imagination. For more details see Supporting Ultibo.

Current Status


Ultibo core is a work in progress (all software is) so not every feature is supported or fully implemented yet. The support for both features and hardware will continue to grow with each release with new support added based on need. Priority is always given to developing those things that have the most benefit and on ensuring that both performance and stability are continually improved.

Current Status - The status of support for features and functionality as of the current version.
Supported Hardware - Detailed information on what hardware is currently supported and ready to use.
Bug Tracking - Information about currently known and reported bugs.

Useful Resources


In spite of the revolution in information provided by the internet, sources of information are still scattered widely and can often be difficult to find unless you search for exactly the right words. We're gathering an ever growing list of resources related to all aspects of developing embedded devices so if you find something good let us know and we'll add it so that others can benefit.

Useful Resources - A collection of the best information we can find on everything related to Ultibo core.
Video Tutorials - Our video tutorial collection including the Discovering Ultibo series.

License


Ultibo core is licensed under the GNU Lesser General Public License v2.1 and is freely available to use, modify and distribute within the terms of the license. The license includes an exception statement to permit static linking with files that are licensed under different terms.

RetroBSD: Unix for microcontrollers

RetroBSD is a port of 2.11BSD Unix intended for embedded systems with fixed memory mapping. The current target is the Microchip PIC32 microcontroller with 128 kbytes of RAM and 512 kbytes of Flash. The PIC32 processor has a MIPS M4K core, executable data memory and flexible RAM partitioning between user and kernel modes.

  • Small resource requirements. RetroBSD needs only 128 kbytes of RAM to be up and running user applications.

  • Memory protection. Kernel memory is fully protected from user applications using hardware mechanisms.

  • Open functionality. Usually, the user application is fixed in Flash memory, but in the case of RetroBSD any number of applications can be placed on an SD card and run as required.

  • Real multitasking. Standard POSIX API is implemented (fork, exec, wait4, etc.).
  • Development system on-board. It is possible to have C compiler in the system, and to recompile the user application (or the whole operating system) when needed.

Pregnancy Discrimination Is Rampant Inside America’s Biggest Companies

When she got pregnant, Otisha Woolbright asked to stop lifting heavy trays at Walmart. Her boss said she had seen Demi Moore do a flip on TV when she was nearly full-term — so being pregnant was “no excuse.” Ms. Woolbright kept lifting until she got hurt.

When she got pregnant, Rachel Mountis was winning awards for being a top saleswoman at Merck. She was laid off three weeks before giving birth.

When she got pregnant, Erin Murphy, a senior employee at the financial giant Glencore, was belittled on the trading floor. After returning from maternity leave, she was told to pump milk in a supply closet cluttered with recycling bins.

American companies have spent years trying to become more welcoming to women. They have rolled out generous parental leave policies, designed cushy lactation rooms and plowed millions of dollars into programs aimed at retaining mothers.

But these advances haven’t changed a simple fact: Whether women work at Walmart or on Wall Street, getting pregnant is often the moment they are knocked off the professional ladder.

Throughout the American workplace, pregnancy discrimination remains widespread. It can start as soon as a woman is showing, and it often lasts through her early years as a mother.

The New York Times reviewed thousands of pages of court and public records and interviewed dozens of women, their lawyers and government officials. A clear pattern emerged. Many of the country’s largest and most prestigious companies still systematically sideline pregnant women. They pass them over for promotions and raises. They fire them when they complain.

In physically demanding jobs — where an increasing number of women unload ships, patrol streets and hoist boxes — the discrimination can be blatant. Pregnant women risk losing their jobs when they ask to carry water bottles or take rest breaks.

In corporate office towers, the discrimination tends to be more subtle. Pregnant women and mothers are often perceived as less committed, steered away from prestigious assignments, excluded from client meetings and slighted at bonus season.

Each child chops 4 percent off a woman’s hourly wages, according to a 2014 analysis by a sociologist at the University of Massachusetts, Amherst. Men’s earnings increase by 6 percent when they become fathers, after controlling for experience, education, marital status and hours worked.

“Some women hit the maternal wall long before the glass ceiling,” said Joan C. Williams, a professor at University of California Hastings College of Law who has testified about pregnancy discrimination at regulatory hearings. “There are 20 years of lab studies that show the bias exists and that, once triggered, it’s very strong.”

Of course, plenty of women decide to step back from their careers after becoming mothers. Some want to devote themselves to parenthood. Others lack affordable child care.

But for those who want to keep working at the same level, getting pregnant and having a child often deals them an involuntary setback.

The number of pregnancy discrimination claims filed annually with the Equal Employment Opportunity Commission has been steadily rising for two decades and is hovering near an all-time high.

It’s not just the private sector. In September, a federal appeals court ruled in favor of Stephanie Hicks, who sued the Tuscaloosa, Ala., police department for pregnancy discrimination. Ms. Hicks was lactating, and her doctor told her that her bulletproof vest was too tight and risked causing a breast infection. Her superior’s solution was a vest so baggy that it left portions of her torso exposed.

Tens of thousands of women have taken legal action alleging pregnancy discrimination at companies including Walmart, Merck, AT&T, Whole Foods, 21st Century Fox, KPMG, Novartis and the law firm Morrison & Foerster. All of those companies boast on their websites about celebrating and empowering women.

As a senior woman at Glencore, the world’s largest commodity trading company, Erin Murphy is a rarity. She earns a six-figure salary plus a bonus coordinating the movement of the oil that Glencore buys and sells. Most of the traders whom she works with are men.

The few women at the company have endured a steady stream of sexist comments, according to Ms. Murphy. Her account of Glencore’s culture was verified by two employees, one of whom recently left the company. They requested anonymity because they feared retaliation.

On the company’s trading floor, men bantered about groping the Queen of England’s genitals. As Glencore was preparing to relocate from Connecticut to New York last February, the traders — including Ms. Murphy’s boss, Guy Freshwater — openly discussed how much “hot ass” there would be at the gym near the new office.

In 2013, a year after she arrived, Mr. Freshwater described Ms. Murphy in a performance review as “one of the hardest working” colleagues. In a performance review the next year, he called her a “strong leader” who is “diligent, conscientious and determined.”

But when Ms. Murphy told Mr. Freshwater she was pregnant with her first child, he told her it would “definitely plateau” her career, she said in an affidavit. In 2016, she got pregnant with her second child. One afternoon, Mr. Freshwater announced to the trading floor that the most-read article on the BBC’s website was about pregnancy altering women’s brains. Ms. Murphy, clearly showing, was the only pregnant woman there.

“It was like they assumed my brain had totally changed overnight,” Ms. Murphy, 41, said in an interview. “I was seen as having no more potential.”

When she was eight months pregnant, she discussed potential future career moves with Mr. Freshwater. According to her, Mr. Freshwater responded: “You’re old and having babies so there’s nowhere for you to go.”

A Glencore spokesman declined to comment on Mr. Freshwater’s behalf.

After she came back from four months of maternity leave, she organized her life so that having children wouldn’t interfere with her career. She arranged for child care starting at 7 a.m. so she would never be late.

But as her co-workers were promoted, her bosses passed her over and her bonuses barely rose, Ms. Murphy said.

When there was an opening to be the head of her department, Ms. Murphy said she never got a chance to apply. The job instead went to a less experienced man. Ms. Murphy said an executive involved in the selection process had previously asked repeatedly whether she had adequate child care.

Ms. Murphy said that after she missed out on another job, the same Glencore executive told her it was because of the timing of her maternity leave. Ms. Murphy has retained a lawyer and is planning to file a lawsuit against Glencore.

Glencore’s spokesman, Charles Watenphul, defended the company’s practices. “Glencore Ltd. is committed to supporting women going on and returning from maternity leave,” he said. He said Ms. Murphy was never passed over for promotions or treated differently because of her pregnancies. He said that she received bonuses and pay increases every year. Her lawyer, Mark Carey, said that Ms. Murphy was only given cost-of-living increases and was denied opportunities to advance.

Ms. Murphy’s problems are not rare. Managers often regard women who are visibly pregnant as less committed, less dependable, less authoritative and more irrational than other women.

A study conducted by Shelley Correll, a Stanford sociologist, presented hundreds of real-world hiring managers with two résumés from equally qualified women. Half of the résumés signaled that the candidate had a child. The managers were twice as likely to call the apparently childless woman for an interview. Ms. Correll called it a “motherhood penalty.”

“There is a cultural perception that if you’re a good mother, you’re so dedicated to your children that you couldn’t possibly be that dedicated to your career,” Ms. Correll said.


A paper published in November by researchers at the Census Bureau examined the pay of spouses. Two years before they had their first child, the husbands made only slightly more than their wives. By the time their children turned 1, the size of that pay gap had doubled to more than $25,000. Women taking maternity leave, dropping out of the work force or working fewer hours could contribute to that disparity, but it does not explain all of it, the researchers said.

Ms. Murphy still works at Glencore. In January, she filed a complaint of pregnancy discrimination with the Equal Employment Opportunity Commission. Last year, the E.E.O.C. received 3,184 pregnancy discrimination complaints, about twice as many as in 1992, when the agency began keeping electronic records. Regulators say many women never file complaints because they can’t afford an attorney, don’t recognize that what happened to them is illegal or fear retaliation.

Merck, the giant pharmaceutical company based in Kenilworth, N.J., presents itself as a champion of professional women. “We celebrate the women whose hard work and tenacity have helped us continue to invent for life,” the company’s website boasts.

That is part of the reason Rachel Mountis wanted to work there.

Within a year of joining in 2005, she was given a coveted job selling vaccines. She was promoted four years later. She won a Vice President’s Club Award for sales and a Peer Award for “outstanding leadership.” Merck paid for her to get a master’s degree in business at New York University.

Ms. Mountis knew that when she got pregnant in 2010 she would need to take several weeks off for maternity leave. That meant she wouldn’t be able to stay in constant contact with the doctors whom she had cultivated as customers — and that her absence could cost Merck business.

A few weeks before Ms. Mountis’s due date, Merck told her and a handful of colleagues that they were being laid off in a downsizing.

“On paper, I was the same professional that I was nine months earlier,” she said. Being pregnant “was the only thing that was different.”

Ms. Mountis eventually got another job at Merck, but it was a demotion with lower bonus potential.

Merck was already facing a lawsuit accusing the company of paying women less than men and denying them professional opportunities. That suit, in New Jersey federal court, was brought by Kelli Smith, a Merck saleswoman who said her career was derailed when she got pregnant. “You’re not going anywhere” at the company, a male colleague told Ms. Smith, according to the suit.

The women involved in the litigation say they were harassed by male superiors.

At a conference, a Merck executive referred to a female employee as the “hottest one in here” and asked what he could do to get her upstairs to his hotel room, according to court documents.

At another company event, the same executive referred to a group of women from a company that Merck had just acquired as “whores” and said “they are much hotter than the Merck whores.”

In 2014, Ms. Mountis joined the lawsuit, which now covers roughly 3,900 women.

A trial date has not been set. A Merck spokeswoman said the company “has a strong anti-discrimination policy.” Ms. Mountis, the spokeswoman said, “was supported throughout her career to ensure she had opportunities to advance and succeed.”

Ms. Mountis tried to make the best of her less prestigious job. Merck demoted her again in 2012, while she was on maternity leave after giving birth to her second child. The next year, Ms. Mountis resigned. She eventually took a job at Teva Pharmaceutical, which is a fraction of Merck’s size.

“I am still trying to get my momentum back,” Ms. Mountis said.

Ms. Smith also moved to a much smaller drug company.

Other drug companies have faced similar complaints. Novartis in 2010 agreed to pay $175 million to settle a class-action lawsuit in which thousands of current and former sales representatives said the company discriminated against women, including expecting mothers, in pay and promotions.

One former Novartis saleswoman, Christine Macarelli, said that her boss told her that “women who find themselves in my position — single, unmarried — should consider an abortion.” When she returned from maternity leave, she said she was told to stop trying to get a promotion “because of my unfortunate circumstances at home — being my son Anthony.”

The nation’s first law against pregnancy discrimination traces back to a 1970s case about how General Electric treated expectant mothers.

The company at the time gave paid time off to workers with disabilities, but not to pregnant women. The Supreme Court ruled in 1976 that the company’s policy wasn’t discriminatory.

Feminist leaders and unions campaigned to change the law to protect pregnant women. In 1978, Congress passed the Pregnancy Discrimination Act, which made it illegal to treat pregnant women differently from other people “similar in their ability or inability to work.”

That didn’t resolve the issue. Employers argued in court that pregnant women were most “similar” to workers injured off the job and, therefore, didn’t deserve accommodations.

Then, Peggy Young sued UPS for discrimination. She had been an early-morning driver when she got pregnant in 2006. Her doctor instructed her not to lift heavy boxes. UPS told her it couldn’t give her a light-duty job. She ended up on unpaid leave without health insurance.

At the time, UPS gave reprieves from heavy lifting to drivers injured on the job and those who were permanently disabled. Even employees who had lost their licenses after driving drunk got different assignments. Ms. Young argued that she should have gotten the same deal.

Two federal courts ruled in UPS’s favor. Ms. Young appealed to the Supreme Court. During oral arguments in 2014, Justice Ruth Bader Ginsburg challenged UPS’s lawyer to cite “a single instance of anyone who needed a lifting dispensation who didn’t get it except for pregnant people.” The UPS lawyer drew a blank.

In 2015, the court ruled 6 to 3 in Ms. Young’s favor. But the justices stopped short of establishing an outright protection for expectant mothers. They just said that if employers are accommodating big groups of other workers — people with disabilities, for example — but not pregnant women, they are probably violating the Pregnancy Discrimination Act.

Otisha Woolbright heaved 50-pound trays of chickens into industrial ovens every day at her job in the deli and bakery of a Walmart in Jacksonville, Fla.

In 2013, when she was three months pregnant, she started bleeding and went to the emergency room. She was told that she was at risk of miscarrying. She returned to Walmart with a physician’s note saying that she should avoid heavy lifting. She asked for light duty.

That’s when her boss, Teresa Blalock, said she had seen a pregnant Demi Moore do acrobatics on TV.

In an email to The Times, Ms. Moore said that a stunt double actually performed the routine.

“You would have to be extremely ignorant and inexperienced with pregnancy or just completely uncaring and insensitive to use a moment of comedic entertainment, like my appearance on David Letterman while I was eight and a half months pregnant, to pressure a pregnant woman into doing something that put her or her baby at risk,” she said.

According to Ms. Woolbright, Ms. Blalock said that if she couldn’t lift chickens, she could “walk out those doors.”

Ms. Woolbright couldn’t afford to lose her paycheck, so she kept lifting chickens.

“What choice did I have? There was no other job that was going to hire me being pregnant,” she said.

Later that month, Ms. Woolbright said, she was lifting a tray of chickens when she felt a sharp pain. Scared she was having a miscarriage, she went back to the hospital. Walmart then put her on light duty.

“We disagree that a specific request for accommodations due to pregnancy was made and that we denied that request,” a Walmart spokesman, Ragan Dickens, said. He said that “Ms. Blalock, a mother and a grandmother, was supportive of Ms. Woolbright.”

Ms. Woolbright asked about maternity leave. Three days later, she said she was called into a cramped office. She stood there sweating, seven months pregnant. “Walmart will no longer be needing your services,” a supervisor said.

Ms. Woolbright sued Walmart, the nation’s largest employer. Her suit, which is seeking class-action status, is pending.

It took Ms. Woolbright a year to land another job. Her children outgrew their clothes. She thought about swallowing enough antidepressants to kill herself. After stints at a restaurant and a van rental company, she stopped working, because she couldn’t get shifts that allowed her to take care of her children.

Walmart is the least expensive store in town, and Ms. Woolbright goes there to buy baby formula and diapers. “It’s torture,” she said.

Seven hundred miles to the north, Candis Riggins was scrubbing toilets at a Walmart in Laurel, Md., when she started to feel sick. She was five months pregnant, and the smell of the cleaning fluids nauseated her. She complained several times to a manager, who refused to permanently reassign her to another position. So she kept cleaning bathrooms, often pausing to vomit.

Doctors told her that chemicals in the cleaning products were endangering her and her unborn child.

One chilly morning on her way to work, she fainted at the bus stop.

Ms. Riggins again asked a manager for a different job. This time, Walmart let her clean the store’s doors, instead of the bathrooms. But she said the chemicals still made her ill.

She was eight months pregnant when she started regularly missing shifts. Walmart fired her, citing the absences. She now works at Target.

Mr. Dickens, the Walmart spokesman, said the company allowed her to stop working with the chemicals she complained about and occasionally let her work as a cashier or store greeter. Ms. Riggins’s lawyer, Dina Bakst, said that her client still had to spend most of her days cleaning.

In 2017, under pressure from Ms. Woolbright’s class-action lawsuit and E.E.O.C. complaints, Walmart updated its guidelines on how to accommodate pregnant women. The nationwide policy now includes a temporary, less-taxing job as a “possible” solution. It doesn’t provide a guarantee.

Rotary Speaker (Leslie) Technology White Paper


When we decided to create a studio-class pedal that faithfully recreates the classic, unmistakable sound of the most sought-after rotating speaker system, we prepared to study every nuance.

Our sound design labs have been filled with those signature, swirling, three-dimensional sounds, as we painstakingly analyzed and recreated the physics and mechanics behind these systems.

The result is our rotating speaker technology found in Lex Rotary. Pete Celi, our Lead DSP Engineer and Sound Designer, illustrates the research and sound design process in the White Paper below. Check it out!

Rotary Speaker Overview

Classic rotary speaker systems consist of a spinning horn for the high frequencies, and a rotating drum fed by a separate driver for the low frequencies. There are typically two motor speeds, slow and fast, which are also referred to as chorale and tremolo. These systems were originally designed for use with electric organs, but guitar players soon wanted in on the fun.

FIG 1. SCHEMATIC DIAGRAM OF LESLIE® TWIN-ROTOR SPEAKER SYSTEM

Rotary speaker systems create dimension and depth when rotating slowly, while generating controlled chaos when spinning at fast speeds. While simple vibrato or chorus effects can create a “poor man’s rotary” sound, a dedicated DSP implementation is required for an accurate reproduction of the many varied aspects responsible for this classic sound. Successful DSP implementation requires a comprehensive study of the physical acoustic phenomena that occur in these rotary speaker systems. Some of the key processes are discussed below.

Horn

The most identifiable effect that a rotary speaker system imparts is the pitch fluctuations known as the Doppler effect. This is a result of the horn’s movement relative to the listener, in the same way a siren appears to change pitch when a fire engine passes by.

FIG 2. THE DOPPLER EFFECT FOR A MOVING SOUND SOURCE

Since the speaker makes the same movement cyclically, the pitch fluctuations occur cyclically also. This is why a traditional vibrato or chorus is sometimes substituted for a rotary effect.
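As a rough numerical sketch of this cyclic pitch deviation (not Strymon's algorithm — the rotor speed, horn arm length, and listener geometry below are hypothetical values chosen for illustration), the moving-source Doppler formula gives:

```python
import math

def doppler_freq(f0, rotor_rpm, horn_radius_m, t, c=343.0):
    """Perceived frequency of a tone radiated from a horn tip rotating
    at rotor_rpm, for a distant listener in the plane of rotation.
    The velocity component toward the listener is v * sin(omega * t)."""
    omega = 2 * math.pi * rotor_rpm / 60.0   # angular speed, rad/s
    v = omega * horn_radius_m                # tip tangential speed, m/s
    v_radial = v * math.sin(omega * t)       # component toward listener
    return f0 / (1.0 - v_radial / c)         # moving-source Doppler shift

# One full revolution at a fast "tremolo" speed (~400 rpm), 15 cm arm:
f0 = 440.0
shifts = [doppler_freq(f0, 400.0, 0.15, t / 1000.0) for t in range(150)]
print(round(min(shifts), 1), round(max(shifts), 1))
```

Even this simplified model shows a pitch swing of several hertz on a 440 Hz tone, repeating once per revolution — the vibrato-like core of the rotary sound.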

FIG 3. DOPPLER EFFECT OF ROTATING HORN SPEAKER

However, the Doppler effect as produced by a horn spinning inside a cabinet is much more complicated than what is produced by a simple chorus or vibrato effect. As the horn spins, the sound waves from the horn reflect off the interior surfaces of the speaker cabinet, with each of these surfaces experiencing its own Doppler effect before creating secondary reflections on to other surfaces. The sound that emanates from the cabinet to the listener (or microphone) is a complex combination of the horn’s direct sound and the many reflections.

FIG 4. DOPPLER EFFECT AND MULTIPLE REFLECTIONS INSIDE ROTARY SPEAKER CABINET

The spinning horn also produces amplitude and frequency response variations throughout its rotation. As expected, the horn’s direct signal is loudest and brightest when facing the listener, and softer and duller when facing away. These aspects also come into play in determining the nature of the many reflection signals.

Drum

The typical drum configuration is a downward-firing speaker projecting into a rotating cylinder that has a rectangular cutout. An electronic crossover circuit limits the bandwidth of the speaker such that only low frequencies are projected into the drum. As the cylinder spins and the cutout revolves, a pulsing amplitude modulation (tremolo) effect is produced for the lower frequencies. The phase of the amplitude-modulated signal also changes as the cutout moves across and to the rear of the cabinet. The resultant sound produced by the drum is hypnotic and has a “breathing” quality to it.
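A toy model of that pulsing low-frequency tremolo (an illustrative assumption, not the pedal's actual envelope — the drum speed and modulation depth are made up) treats the cutout sweeping past the listener as a raised-cosine gain on the low band:

```python
import math

def drum_gain(t, drum_rpm=340.0, depth=0.6):
    """Toy amplitude envelope for the bass rotor: gain dips once per
    revolution as the cutout rotates away from the listener."""
    omega = 2 * math.pi * drum_rpm / 60.0    # drum angular speed, rad/s
    return 1.0 - depth * 0.5 * (1.0 - math.cos(omega * t))

# Sample one revolution (60/340 s, about 176 ms):
gains = [drum_gain(t / 1000.0) for t in range(177)]
print(round(min(gains), 2), round(max(gains), 2))
```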

FIG 5. THE LOW-FREQUENCY BASS ROTOR

Miking

The classic approach to capturing the movement of sound involves a pair of mics at the top of the cabinet at the horn and a single mic at the bottom to pick up the drum. As the mics are moved closer to the cabinet, the amplitude fluctuations caused by the inverse square law effect become more pronounced and the horn signal gets a recognizable “choppy” quality at high speeds. Another result of close miking is an enhanced stereo effect that is very noticeable at slow speeds as the horn passes by one mic and then the next. As the mics are moved back, the fluctuations even out, eventually creating the sound that would be heard naturally in the room at a distance from the cabinet.
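The distance dependence is easy to quantify. A minimal sketch (with hypothetical dimensions) of the peak-to-trough level swing caused purely by the horn sweeping toward and away from a mic:

```python
import math

def level_swing_db(mic_dist_m, horn_radius_m=0.15):
    """Peak-to-trough level swing (dB) due only to the change in
    source-to-mic distance over one horn revolution (1/r amplitude)."""
    near = mic_dist_m - horn_radius_m        # horn tip closest to mic
    far = mic_dist_m + horn_radius_m         # horn tip farthest away
    return 20.0 * math.log10(far / near)

# A close mic (30 cm) vs. a room mic 2 m back:
print(round(level_swing_db(0.3), 1), round(level_swing_db(2.0), 1))
```

The close mic sees a swing of nearly 10 dB — the "choppy" quality — while the distant mic sees little more than 1 dB, matching the smoother room sound described above.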

FIG 6. CLASSIC APPROACH TO LESLIE® CABINET MIKING

Motor speeds, ramping, and braking

In both the tremolo and chorale speeds, the horn spins slightly faster than the drum, so that the resultant sound is much more complex and evolving than if the two were spinning at identical speeds. Additionally, the inertia of the low-frequency drum is much greater than that of the horn, making it more resistant to changes in speed. Thus, while the horn speeds up and slows down rather quickly, the drum takes much longer to reach its speed. Changing speeds is where the “magic” of these systems is most apparent.

Some rotary systems allow for “braking”, which is when the speed of the horn and drum is reduced to zero so there is no more rotation. With the brake applied, the system is just a two-way stationary speaker system. When the brake is released and the system starts spinning again, the full impact of the complexity of the system unfolds.
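One common way to model the differing inertias (a sketch under assumed time constants and speeds, not necessarily Strymon's implementation) is a one-pole exponential ramp toward the target rotation speed, with a much larger time constant for the heavy drum; braking is simply a ramp toward zero:

```python
import math

def ramp(speed_hz, target_hz, dt, tau):
    """One-pole (exponential) approach toward the target rotation
    speed; a larger time constant tau models greater rotor inertia."""
    return target_hz + (speed_hz - target_hz) * math.exp(-dt / tau)

# Switch from chorale (~0.7 Hz) to tremolo (~6.7 Hz), hypothetical taus:
horn, drum = 0.7, 0.7
for _ in range(100):                         # simulate 1 s in 10 ms steps
    horn = ramp(horn, 6.7, 0.01, tau=0.25)   # light horn: ramps quickly
    drum = ramp(drum, 6.7, 0.01, tau=2.0)    # heavy drum: lags behind
print(round(horn, 2), round(drum, 2))
```

After one second the horn has nearly reached tremolo speed while the drum is still accelerating, so the two rotors drift through ever-changing phase relationships — the "magic" of the speed change.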

Amplifier

The original rotary systems had a tube amplifier built in to drive the speakers. Overdriving the amp creates harmonics that add a new dimension once they are set into motion through the rotating system. This sound is often referred to as the “growl” of a rotary speaker system, and it has become a signature trademark of these systems.

Strymon Rotary Algorithms

In developing the algorithms that produce these unmistakable sounds, we painstakingly analyzed and recreated the physics, mechanics, and intricate processes discussed above.

The horn signal exhibits all the chaotic yet periodic fluctuations inherent in rotary speaker cabinets. The drum signal pulses and breathes. A two-speed motor engine with braking capability controls the independent Horn and Drum processes. The speed ramp-times reflect the drum’s resistance to change and the horn’s light weight. Fast and slow speeds are independently adjustable, and the acceleration times can be trimmed.

Additionally, a variable mic-distance control allows a wide range of sounds, from dramatic close-up sweeping and swirling to more mellow and calming undulations. A tube preamp drive control allows for overdriving the system to create rich harmonic content, with additional control of the Horn level to match your amp’s voicing. All of this without having to lug around a behemoth cabinet, set up microphones, worry about proper microphone placement, or perform costly motor maintenance and cleaning.

*All product names used in this article are trademarks of their respective owners, which are in no way associated or affiliated with Strymon or Damage Control.


List of Chinese goods hit by the new 25% tariff [pdf]

CopperheadOS update: Developer suspended from Reddit


Important update on the CopperheadOS situation: James Donaldson (registered here as darknetj), the CEO of Copperhead, whose shady business dealings have been leaked, is actively trying to silence the developer (u/strncat) so that there can be no communication between the developer and customers, and to delete or mask the information released about him and his business practices.

So far, he has pressured Reddit into deleting the developer's Reddit account (u/strncat). He may try to seize control of r/CopperheadOS next. More information here, for the time being at least: Link

Twenty-Five Moves Suffice for Rubik's Cube


(Submitted on 24 Mar 2008)

Abstract: How many moves does it take to solve Rubik's Cube? Positions are known that require 20 moves, and it has already been shown that there are no positions that require 27 or more moves; this is a surprisingly large gap. This paper describes a program that is able to find solutions of length 20 or less at a rate of more than 16 million positions a second. We use this program, along with some new ideas and incremental improvements in other techniques, to show that there is no position that requires 26 moves.
From: Tomas Rokicki
[v1] Mon, 24 Mar 2008 19:37:09 GMT (15kb)

Crystal 0.25


Crystal 0.25.0 has been released!

As with every release, it includes numerous bugfixes, cool features and performance improvements — 400 commits since 0.24.2 from 47 contributors. Special mention goes to @MakeNowJust, @straight-shoota, @Sija and @bew for their hard work on this release.

There were a ton of contributions merged into master even before 0.24.2 was released. But since 0.24.2 was already changing the release packaging for Linux, changing the CI and fixing 0.24.1, some features needed to wait their turn a little longer.

Once again, we have tested this release by compiling some of the most popular crystal shards. This helps us catch and fix unintended breaking changes earlier in the release cycle as well as submitting PRs to the shards and contributing a bit more with the community. This process is codified using the scripts in the test-ecosystem repository, which is still fairly new, but so far it’s working well.

The least visible work usually goes into infrastructure, and there are always improvements and things waiting to be done. The latest news in this area:

  • Docs in master are back. For every PR that is merged the docs at HEAD can be found at /api/master/.
  • Improved SEO by adding a canonical url for online docs #5990.
  • Also on docs, lots of improvements regarding navigation have been done in #5229.
  • The automated release process now covers 32-bit Linux releases. As a bonus, the packaging has been aligned again with the 64-bit packages, so some paths have changed.
  • We’ve been contacted by Heroku to early register our buildpack. Stay tuned to future Heroku news to update to the crystal-lang/crystal buildpack in the registry. All in all it’s one more taste of the adoption of Crystal out there, and we are thrilled.

Nightly packages in nightly.crystal-lang.org are still down. The workaround for now is to use the docker image crystallang/crystal:nightly.

Shards is updated to 0.8.0

There are some performance improvements in shards for this release, by downloading less information when possible. A new global cache was added, so you don’t need to download your favorite shards over and over in all of your favorite projects. FYI, you can use shards 0.8.0 with Crystal 0.24.2 if you want.

Read more here.

Automatic casts for literal values

If a method is defined as def foo(x : Int8) or def bar(color : Color) with

enum Color
  Red
  Green
  Blue
end

up to 0.24 you would need to call them as foo(1i8) or bar(Color::Blue). But since 0.25.0 you can simply write foo(1) and bar(:blue). A note of caution: this only works with literal values. If the value is saved in a variable and used as an argument, it won’t work.

This feature allows cleaner code without sacrificing safety. Read more at #6074.

User defined annotations and [JSON|YAML]::Serializable

This new language construct allows users to define their own annotations, like @[Link]. Basically, you can annotate type declarations or instance variables, and later query them in macros to do whatever you wish.

Before this feature metaprogramming usually involved calling one macro with all the information needed. From now on, a more decoupled mechanism between declaring and consuming can be used. Read more at #6063.

The new JSON::Serializable and YAML::Serializable modules use these annotations. Feedback is welcome since this feature is brand new. You can read more at #6082, JSON::Serializable, YAML::Serializable docs.

Another usage of annotations might be to declare a registry of classes, like the one used in DB drivers or frameworks handlers. And it could enable the removal of mutating values of constants during compilation time in favor of a more declarative code.

Do not collapse unions for sibling types

Code is worth a thousand words (you know, like pictures):

class Foo
end

class Bar < Foo
end

class Baz < Foo
end

var = rand < 0.5 ? Bar.new : Baz.new
typeof(var) # => Bar | Baz

Up to 0.24.2 the result was typeof(var) #=> Foo.

Although the previous code already compiled fine in 0.24.2, this change allows the type system to deal with some cases that would have ended in a compile-time error but that actually make sense. At the end of the day, the type system is about identifying which programs will safely run and rejecting the ones that won’t.

The following program is an example of that. It won’t compile in 0.24.2 but it now does in 0.25.0.

class Foo
end

class Bar < Foo
  def do_it
  end
end

class Baz < Foo
  def do_it
  end
end

class Qux < Foo
  # there is no do_it
end

var = rand < 0.5 ? Bar.new : Baz.new
var.do_it

This is particularly useful in scenarios where there is a huge hierarchy of types but in a section of the code only a subset is used.

You can read more at #6024 and discover when unions of types are still collapsed to the common ancestor (spoiler: it happens when they are not siblings).

JSON::Any and YAML::Any changes

There were some subtle inconsistencies in the JSON::Any and YAML::Any APIs. The bottom line is that on an ::Any value you can use #[] to traverse it, and it will always return an ::Any value. If you need a specific type from the ::Any value (for example, to use Enumerable methods if it is an array), call the already-familiar #as_a, #as_h, #as_s methods.

We still encourage, when possible, the use of JSON.mapping, JSON::Serializable or JSON::PullParser when finer control is needed.

Read more at #5183 and in the JSON::Any and YAML::Any docs.

HTTP::Server can bind to multiple addresses

This will break lots of presentations and even the code shown in our own homepage but the benefits are great.

From now on if you use the built-in HTTP::Server you first need to configure it, then bind to one or more addresses, and lastly you listen to all of them. These addresses can be TCP ports or Unix sockets.

require "http/server"

server = HTTP::Server.new do |context|
  context.response.content_type = "text/plain"
  context.response.print "Hello world, got #{context.request.path}!"
end

server.bind "0.0.0.0", 8080
server.bind_unix "/tmp/app.sock"
server.listen

There is still a shortcut to bind and listen, but it doesn’t avoid a breaking change. Read more at #5776, #5959, and the HTTP::Server docs

Welcome to the TimeZone Jungle

There was a huge refactor in Time. If you hit a unicorn while opening the PR to read more about it, just try again.

Starting now Time has #location and #offset properties to know the timezone exactly. Time.now and Time.new will return by default information in the local timezone, while Time.utc_now and Time.utc will return information in UTC.

Methods like #to_local, #to_utc, #utc?, #local? and #in(location : Location) will help you to move around the globe faster than a plane.

The API even allows you to use custom timezones and fixed offsets with Time::Location.fixed.

Another change in the Time namespace is the formatters. ISO 8601, RFC 3339, RFC 2822, HTTP-encoded dates, YAML, and other places where time is parsed or emitted now use custom formatters that deal with more cases as expected in each scenario.

Read more at #5324 and #5123 and Time, and Format docs.

Replace File::Stat with File::Info and other file API changes

Some time ago an abstraction for the running OS was introduced in the stdlib. The goal was to be able to run the Crystal compiler on non-POSIX platforms while keeping the stdlib as clean as possible. Feel free to check src/crystal/system, but keep in mind it is not intended as a public API.

This also required picking names and abstractions in the stdlib that fit everybody: POSIX and non-POSIX.

The API was renamed and reworked for compare operations and accessing file properties and permissions. It is much clearer now. Hopefully it doesn’t affect too many users, since most of us use File.open, File.write and move on. Read more at #5584, #5333, #5553, #6161, File and File::Info docs.

Heredoc on every argument

If you use heredocs a lot, you might be interested in this one. Up to 0.24.2, if you wanted to call a method on a string specified using a heredoc, you would do:

puts <<-FOO
  message
  FOO.downcase

From now on, the method call needs to be on the initial delimiter:

puts <<-FOO.downcase
  message
  FOO

It’s subtle but important, and it plays better with multiple heredocs in a single call. Now you can:

puts <<-ONE.upcase, <<-TWO.capitalize
  hello world
  ONE
  second message
  TWO

Read more at #5578.

Macro verbatim blocks

If you deal with escaped macros, don’t miss #6108.

Macros are powerful, and they should be used once a boilerplate pattern has been discovered.

This new language construct helps when the macro itself will define, for example, methods that have macro blocks that should be expanded later (i.e. nested macros).

It may result in a nicer way to express the same things you could before with some \{% escaping %}.
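A hypothetical sketch: a macro that emits a method whose body itself contains macro code. Wrapping the inner block in verbatim keeps it from being expanded while the outer macro runs, so it expands later as part of the generated code (previously you would escape each tag as \{% ... %}):

```crystal
macro build_mode_method(name)
  def {{name.id}}
    {% verbatim do %}
      # This inner macro code is emitted literally and expands
      # when the generated method is compiled, not here.
      {% if flag?(:release) %}
        "release build"
      {% else %}
        "debug build"
      {% end %}
    {% end %}
  end
end

build_mode_method mode

mode # => "debug build" (when compiled without --release)
```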

  • crystal deps is dead, long live shards install. #5544. If we hadn’t removed it, you would never have updated your build scripts.
  • Use Hash#key_for to perform a reverse lookup in a hash #5444 #NamesAreHard.
  • The block argument from loop was removed #6026.
  • Fix File.join with empty path component #5915.
  • Colorize#push is dead, long live Colorize#surround #4196. Bonus point: your #to_s can use your favorite color now.
  • Punycode is a special encoding used to convert Unicode characters to ASCII, and it is used to encode internationalized domain names (IDN). It is now available in Crystal thanks to #2543.
  • pp no longer prints the expression, but pp! and the new p! will. p stands for print, pp for pretty print, and ! for show me the ~~money~~ expression #6044.
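A few of the items above in action (a sketch; the printed pp! output shown in the comment is approximate):

```crystal
# Reverse lookup in a hash: find the key for a given value
h = {"one" => 1, "two" => 2}
h.key_for(2)  # => "two"
h.key_for?(3) # => nil (raising vs. nilable variants, as usual)

# Empty path components no longer produce doubled separators
File.join("a", "", "b")

# p! / pp! print the expression as well as its value
pp! 1 + 1 # prints something like: 1 + 1 # => 2
```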

Please update your Crystal and report any issues. If there are regressions or blocking issues with 0.25.0, a 0.25.1 could be released sooner.

Don’t miss the rest of the release changelog information with lots of other fixes.

The development is possible thanks to the community’s effort, 84codes’ support, and every BountySource supporter.

Why Social Science Needs Evolutionary Theory


The lack of willingness to view human cognition and behavior as within the purview of evolutionary processes has prevented evolution from being fully integrated into the social science curriculum.

Photograph by David Carillet / Shutterstock

My high school biology teacher, Mr. Whittington, put a framed picture of a primate ancestor in the front of his classroom—a place of reverence. In a deeply religious and conservative community in rural America, this was a radical act. Evolution, among the most well-supported scientific theories in human history, was then, and still is, deliberately censored from biological science education. But Whittington taught evolution unapologetically, as “the single best idea anybody ever had,” as the philosopher Dan Dennett described it. Whittington saw me looking at the primate in wonder one day and said, “Cristine, look at its hands. Now look at your hands. This is what common descent looks like.”

Evolution has shaped the human body, but it also shaped the human brain, so evolutionary principles are indispensable for understanding our psychology. Yet many students, teachers, and even social scientists struggle to see how our evolutionary history significantly shapes our cognition and behavior today. “Learning” and “culture” do not explain behavior so completely that turning to ideas from evolution is unnecessary. The lack of willingness to view human cognition and behavior as within the purview of evolutionary processes has prevented evolution from being fully integrated into the social science curriculum.

A deeper scientific understanding leads to the view that learning doesn’t compete with evolution as an explanation for human psychology. Learning requires evolved psychological adaptations—general learning mechanisms or mechanisms which may be specific to a particular adaptive problem. Specialized learning mechanisms help us avoid eating toxic food, yet no one is born knowing which particular foods to avoid. Humans have also evolved an aversion to mating with their genetic kin but are not born knowing who their kin are. Solving these adaptive challenges requires evolved psychological learning mechanisms.

Human cognition and behavior is the product of the interaction of genetic and cultural evolution. Gene-culture co-evolution has allowed us to adapt to highly diverse ecologies and to produce cultural adaptations and innovations. It has also produced extraordinary cultural diversity. In fact, cultural variability is one of our species’ most distinctive features. Humans display a wider repertoire of behaviors that vary more within and across groups than any other animal. Social learning enables cultural transmission, so the psychological mechanisms supporting it should be universal. These psychological mechanisms must also be highly responsive to diverse developmental contexts and cultural ecologies.

Take the conformity bias. It is a universal proclivity of all human psychology—even very young children imitate the behavior of others and conform to group norms. Yet beliefs about conformity vary substantially between populations. Adults in some populations are more likely to associate conformity with children’s intelligence, whereas others view creative non-conformity as linked with intelligence. Psychological adaptations for social learning, such as conformity bias, develop in complex and diverse cultural ecologies that work in tandem to shape the human mind and generate cultural variation.

Truly satisfying explanations of human behavior require identifying the components of human cognition that evolution designed to be sensitive to social or ecological conditions and information. For example, populations in which food resources show high variance (large game hunting is very much hit or miss) tend to evoke cooperative adaptations for group-wide sharing compared to those in which food variance is lower and more dependent on individual effort, like gathered foods. Recent discoveries in the field of cultural evolution have demonstrated that our technological complexity is the outcome of our species’ capacity for cumulative culture. It has set our genus Homo on an evolutionary pathway remarkably distinct from the one traversed by any other species. In a paper last year, I proposed that this was a result of psychological adaptations being universal but sufficiently flexible for innovations to build on each other, supporting the acquisition of highly variable behavioral repertoires.

Applying evolutionary theory to social science has the potential to transform education and, through it, society. For example, evolutionary perspectives can help social scientists understand, and eventually address, common social problems. Schoolyard bullying provides one example. Without an evolutionary understanding of the phenomenon, interventions are likely to be ineffective, since they misdiagnose the causes of bullying. Bullying is not merely negative interpersonal behavior; it’s goal-oriented and serves the social function of gaining status and prestige for the bully, which must be understood to combat it. For example, bullying often occurs in front of an audience, suggesting that social attention drives, and may reinforce, the behavior. A 2015 paper suggests most interventions don’t work because they remove the rewards of bullying—increased social status—without offering any alternatives. The researchers recommend that the esteem bullies seek “should be borne in mind when engineering interventions” designed to either decrease a bully’s social status or channel the bully’s social motivations to better ends. A deep understanding of the evolved functions of bullying, in short, provides a fulcrum for potential remedies.

If “nothing in biology makes sense except in the light of evolution,” as the evolutionary biologist Theodosius Dobzhansky argued in 1973, then nothing in human psychology, behavior, and culture does either. Social scientific research should reflect this fact.

Cristine Legare is an associate professor of psychology and the director of the Evolution, Variation, and Ontogeny of Learning Laboratory at The University of Texas at Austin. Her research examines how the human cognitive system enables us to learn, create, and transmit culture. She conducts comparisons across age, culture, and species to address fundamental questions about cognitive and cultural evolution. Follow her on Twitter @CristineLegare.

