
Re: [glosalist] Re: minimal vocabulary

William T. Branch ("William T. Branch" <bill@...>) on April 11, 2006

Hello Robin,

As usual, your last message was a pleasure to read.

At 12:12 AM 4/10/06, Bill Branch replied to Kevin:

Hello Kevin,

After staying up all night looking at the three languages you referred to, I have a greater appreciation for what you’ve been advocating. It seems that Tavo and Glo both fit your design requirements. I don’t understand why you say you’ve abandoned them. *** The abandonment of a Planned Language is easily explained: Esperanto has almost made it to “accepted” status after a century; Glosa has been around for a third of a century, and is still not ‘completed’; any new player has to appeal to people with a range of linguistic interests, and then has to try to launch through a very thick ‘glass ceiling’.

I think Glosa still has a chance if it finds the right “killer app” or combination thereof. Esperanto has been so slow to grow over the last century, I think, because it is not as easy to learn as it claims to be. Their claim that Ido and other movements have slowed their progress dramatically just doesn’t hold water with me. I suspect the reason has more to do with economics.

*** Decisions, decisions! As Kevin shows, he has sought to solve the problems by working alone - or now with Lingua Franca Nova; however, were those who are really interested in the improvement of communication around Planet Earth to work co-operatively, a sub-group working in the area of Syntax-based Language would eventually agree on a ‘best product’. This could include the most suitable features of languages from the un-inflected group of Planned Languages. Whether such a level of agreement is achievable with the egos, personalities and philosophical associations involved … is another matter.

I think it’s possible, if messy. Sustained interest is the most important ingredient, I think. Often there may be clever ways to pull seemingly contradictory ideas into the same language.

   After that, of course, we can imagine a play-off between the extremely polarised contenders, with the highly inflected Esperanto in one corner, and the best product of the un-inflected group in the other.

   Must there be ultimately only the ONE International Auxiliary Language, or can the human race cope with two IALs - to cater for the great divergence of psychological types in the world?  I do suspect that the reason for such vehement dispute over possible world languages is a matter of the balance between right brain and left brain dominance within mankind: left-brainers have good memories and really like complex inflections; right-brainers prefer to let the grey matter do it for them intuitively, and would rather have the apparent simplicity of allowing hard-wired syntax to produce their sentences for them from discrete concepts.  While that last sentence was getting too long, it does suggest that Ron Clark and I come from the Creative camp; and, on the other hand, inflectors - with good memories - are drawn to Esperanto, because they have an intelligent/creative balance tipped in the direction of Intelligence, and can easily cope with the mental loading of an inflectional schema.

Interesting theory. I wonder what linguists will discover about this someday. I always thought the syntax languages were more left-brained, but your explanation sounds logical.

To summarize: I can’t wholeheartedly support Glosa because:

  1. I am aware of perhaps a dozen possible improvements to the language which apparently have no hope of being debated and perhaps eventually accepted into the official language.[1] *** Actually, this Mailing List ought to be the debating board for Glosa.

I was unsure about this myself.

   The trouble seems to be that people, including me, become attached to their ideas, and cannot treat the matter objectively.  My difference from neophyte Glosa-pe is that I have seen a number of failed ploys attempted, and dead ends entered, and retreated from - in the process of defining what Glosa is to-day.  I have also seen a number of ideas promoted, ones which could, or should, have been pursued, in the development of Glosa, but were not.
   There has been no serious discussion, with worked examples, of the actual grammatical system at work within Glosa; the possibility of further, hyphenated inflection-like affixes, like the eighteen 'category endings' (-do, -pe, etc.), has not been explored; no centralised book exchange has been set up, to which authors and translators may send their new Glosa works; the actual mechanics of what constitutes good (or natural, or hard-wired) syntax has not, to my knowledge, been explored; a set of protocols for the addition of new words to the Glosa lexicon has not been developed; and there is no sub-office for developing, for Glosa, the wide range of technical vocabularies that make up our complex civilisation.

I was also considering some kind of book exchange between Glosa-pe. I’m not sure it’s the same thing you’re referring to, however.

  2. I think that a reader should be able to memorize, or have printed, about 500-1000 words of an IAL, and be able to read almost any text that claims to be written in that IAL. The main exceptions would be technical words in a field. Splitting the language into “core” and “full” does not help, unless the non-core words are VERY rarely used.

*** This is a left-brain type statement. Some people love learning new words, and often, the more the merrier. If there is a ‘core’ and ‘advanced’ divide, at what number of words is the division made? If, on the other hand, there is a suitable “Learner Vocabulary” and there are progressively more wordy learner books written, then is a hard-line division necessary? I believe that it is between the standard, global Glosa and the ‘technical lexicons’ that the division exists. If I write a treatise in Glosa on nuclear energy, I will use the Global lexicon, and I will also use vocabulary from the General Science and Nuclear lexicons. Needless to say, such lexicons do not yet exist, and are purely constructs of my mind.

I suspect there is a good dividing line as well. I devised a thought experiment to address this after reading Kevin’s site on Glo.

Using a small vocabulary can be burdensome for the author, but as Kevin stated, that’s OK because most IAL users read more than they write. However, at some point as you shrink the lexicon, it becomes burdensome to read as well, because trying to understand common ideas conveyed in the language becomes a constant game of twenty questions. Where this magic point lies should be a subject of research for any interested IAL developer. It may be as low as Kevin suggests, at 500. Maybe less, maybe more.

Kevin mentions how you would address a situation where you’re talking about an elephant. You would simply say “very large grey animal with long tubular projection from face”. (These are my words, as I don’t have his website open.) If you’re going to talk about elephants often, then after the first description you would say, “I will from here on refer to these as tube-faces,” to convey what you’re talking about.

This is where my thought experiment starts. Let’s imagine that a perfectly wonderful language is developed that can really be learned in a matter of a week or so because the vocabulary is at 500 words. Anything written in it can be understood by anyone who knows these 500 words. After much testing during the design of the language, 500 was the magic number at which the twenty-questions game wasn’t overtaxing to the reader.

Every work written would average a number of grey elephant situations per page, whether a fraction or a number greater than one. This happens anyway with works of English that push into new territory. (Notice my use of the phrase “grey elephant situation” as an example.) Because a lot of these concepts that must be defined up front are relatively common, they will also show up in several other works. All works that mention elephants will face a similar up-front description and a simple reference word for subsequent referencing.

Authors may be tempted, upon seeing several previous works regarding elephants, to just use one of the references without a definition. But this would be using a word not in the vocabulary. For the IAL to keep its integrity, authors MUST always pre-define all references in their own work, because they can never assume the reader has read anything else.

No real problem so far. This happens in all natural languages anyway. In any work dealing with concepts that are likely not to be in the reader’s lexicon, the author should be aware enough to define the words that go with those concepts.
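To make the pre-definition rule concrete, here is a minimal sketch in Python of how a checker for it could work. Everything in it is my own invention for illustration - the “define ...:” marker, the tiny stand-in core list, the function name - nothing like this exists in Glosa, Glo or Tavo.

CORE = {"very", "large", "grey", "animal", "with", "long", "tubular",
        "projection", "from", "face", "i", "will", "refer", "to",
        "these", "as", "here", "on"}   # stand-in for a ~500-word core

def check_predefinition(text: str) -> list[str]:
    """Return words outside the core that are used before being defined."""
    defined: set[str] = set()
    violations: list[str] = []
    for line in text.lower().splitlines():
        if line.startswith("define "):
            # "define tube-face: ..." introduces a new in-work word.
            defined.add(line.removeprefix("define ").split(":")[0].strip())
            continue
        for word in line.replace(",", " ").replace(".", " ").split():
            if word not in CORE and word not in defined:
                violations.append(word)
    return violations

work = """define tube-face: very large grey animal with long tubular projection from face
i will from here on refer to these as tube-face
tube-face very large animal"""

print(check_predefinition(work))         # [] - every compound was pre-defined
print(check_predefinition("tube-face"))  # ['tube-face'] - used, never defined

The first call returns an empty list because the compound was introduced before use; the second flags “tube-face” as a violation of the rule.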

Being that several different authors would need to define elephants in various works, it is likely, even inevitable, that a different compound-like word would be developed for each work, such as “tube-nose”, “floppy-ear-giant”, “thunder-snout”, etc. The author and the reader both know that these words go out of scope at the end of the work.

I imagine a novel written in a language of 500 words might end up with a vocabulary of compound references exceeding the language’s own vocabulary. This does represent a tax on the reader’s memory. A good author would have ways to minimize this in long works with many references.

One way is to space definitions so they don’t clump together too tightly. Another would be to keep using the long description for a while, alongside the corresponding word, until the author feels sure the reader will remember it.

Regardless of the techniques the author uses to minimize memory strain, there will always be some.

A logical step for authors, then, in the search for minimizing strain, is a standard word list for concepts that regularly pop up. This does not remove the need to pre-define all such words in every work, however, because a reader, especially a new one, may not be familiar with this de-facto word list or with previous works using the word. It does, however, make the memory strain of regular readers approach zero, rather than holding it at a constant level of acceptable difficulty for everybody regardless of how experienced they are.
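As a rough picture of that claim - again a sketch with invented names and data, not anything from actual Glosa practice - treat a reader’s memory strain as the number of in-work definitions that are new to that particular reader:

def memory_strain(defined_in_work: set[str], reader_known: set[str]) -> int:
    """Count the defined compounds a reader must newly hold in memory."""
    return len(defined_in_work - reader_known)

work_definitions = {"tube-face", "thunder-snout", "floppy-ear-giant"}
de_facto_list = {"tube-face", "thunder-snout"}  # compounds regulars have met

new_reader = set()              # knows only the core
regular_reader = de_facto_list  # has seen the common compounds before

print(memory_strain(work_definitions, new_reader))      # 3 - full strain
print(memory_strain(work_definitions, regular_reader))  # 1 - approaching zero

The new reader is still served, because every definition is spelled out in the work itself; the regular reader simply skims past the ones already known.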

What this thought experiment shows is that an IAL with a small vocabulary - such as Glosa, but especially Glo and Tavo - SHOULD have both a core and an extended vocabulary. The core should be all any reader has to memorize up front to read any text written in the language for auxiliary purposes. Obviously, those writing for themselves or for other writers may use the whole language without pre-defining all extended words.

This is why I think the decision in Glosa to have a core and extended vocabulary was a good insight. The real questions are: what is the ideal size of the core, and which words should go in it?

We know the core must be as small as possible to get people to fluently read the language ASAP, while not being so small that people are constantly solving word riddles to figure out what the author is talking about.

I, like Robin, don’t mind synonyms. There are a couple of words for “rabbit” in Glosa, as there are for several other concepts. I think many creative minds would not be attracted to a language without a large lexicon for expressiveness. I wouldn’t mind if Glosa had 80,000 words, as long as the core was minimal.

I hope my argument shows that it does not have to be a choice between an IAL and expressiveness. It’s really about writing in the “IAL style”: use a small core that everyone must know, and define, within the work, all words used from outside the core.

-bill

