New Model Activist Site for Just in Time
Research
New Model Push Notes
Information Originates and Interrelates
As I started developing my New
Model ideas for the activist community design, I took a close look at
the embryonic ideas of bloggers about linking concepts. I simultaneously
looked at a little yellow notebook I use for jotting ideas. I
call the notebook my push notes, where the idea is to push ideas out
of the mind and into the Information Society as quickly as possible:
thought pushing.
Little scraps of knowledge-building appear
first in notebooks (the computer being too bulky for creativity), then
get introduced to the Information Society as a small entry. The
entry is modified as the author extends the idea, and finds supporting
information. While existing ideas contribute to the original inception,
the inception, in reality, springs from the thin air of the author's
inspiration.
An idea pushed to a scrap of paper evolves
into an expressed idea; collections of pushed ideas can create aggregates
of ideas forming well-documented text. As concepts evolve, and
scale upwards into the expanding Information Society, meta-joins
of documented concepts can create high-level and cohesive virtual repositories
of information attributable to no single source. Still recognizable
within the joined documents are significant contributors, and the mapping
of constructed information can show the evolution of ideas, giving a
foundation of supporting ideas for the further development of ideas.
This describes truly inclusive knowledge building in the Information
Society.
Ideas are not spawned by ideas; a person
has to form an idea and express it as a concept for an idea to exist,
to be recognized, and to become part of constructed knowledge.
A leading newly inspired idea combined with its contributing and supporting
knowledge gives most, if not all, of the information necessary to create
linking constructs useful in attaching this new idea with documents
built of similar ideas.
"I have this thought: this is where
the thought came from, and these sources contributed to the thought,
or support it" --thus the author reflects on the thought he just developed;
from the reflection he creates linking information so he can allow it
to attach to other similar ideas.
By adding tags (really keywords) to his
knowledge constructs that reflect the ideas within the construct, the
author can allow the ideas in his documents to mix with concurrent documents
from other authors, or even from himself, usually in his immediate environment.
Such is linking in a localized information soup such as a web community.
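As an illustrative sketch of this tag-based mixing, two documents can be compared purely by how much their author-chosen tags overlap. Python is used here purely for illustration, and the documents, tags, and function name are all invented:

```python
# Hypothetical sketch: documents mix when their author-chosen tags overlap.
# Jaccard similarity of the two tag sets is one simple measure of overlap.

def tag_overlap(tags_a, tags_b):
    """Return the Jaccard similarity of two tag sets (0.0 to 1.0)."""
    a, b = set(tags_a), set(tags_b)
    if not (a | b):
        return 0.0
    return len(a & b) / len(a | b)

doc1 = {"katrina", "animals", "rescue", "new-orleans"}
doc2 = {"animals", "rescue", "shelters"}
print(tag_overlap(doc1, doc2))  # 2 shared tags out of 5 distinct -> 0.4
```

Any overlap score above zero would let the two documents mix; a threshold could tune how freely ideas blend in the local soup.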
Since no two documents of original ideas
will reflect the same impression with which to create linking mechanisms,
similar documents, possibly closely linked, will offer differing profiles
of linking information.
As documents with similar ideas link
to each other they can form a cluster. They can pull to themselves,
as a cluster, other text that contains more diverse ideas because of
the differences between them. This ongoing linking process can
create expanded clusters of ideas that blend, in a multi-dimensional
medium, to other clusters likewise built of closely linked ideas, just
as colors blend in a spectrum.
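This cluster formation could be sketched, under the simplifying assumption that any shared tag links two documents, as a connected-components pass over a set of documents (Python for illustration; the documents and tags are invented):

```python
# Hypothetical sketch: documents that share at least one tag link to each
# other, and connected groups of linked documents form clusters.

def clusters(docs):
    """Group document names into clusters of tag-sharing documents."""
    names = list(docs)
    parent = {n: n for n in names}  # simple union-find structure

    def find(n):
        while parent[n] != n:
            n = parent[n]
        return n

    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if docs[a] & docs[b]:          # any shared tag links a and b
                parent[find(a)] = find(b)  # merge their clusters

    groups = {}
    for n in names:
        groups.setdefault(find(n), set()).add(n)
    return list(groups.values())

docs = {
    "levees": {"katrina", "flooding"},
    "rescue": {"katrina", "animals"},
    "shelters": {"animals"},
    "gardening": {"compost"},
}
print(clusters(docs))  # levees/rescue/shelters chain together; gardening floats alone
```

Note how "levees" and "shelters" share no tag directly, yet end up in one cluster through "rescue": this is the blending of adjacent clusters described above.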
Thus form floating clusters of meta-information
constructs, and with them, newly freed and joined knowledge-building
with no physical limitations.
These floating clusters have to originate
from somewhere. They may develop in a discussion environment,
for instance, where each idea is proposed in the form of a response
to some other idea. Here, the discussion environment provides
a scaffolded construction area based on commonly accepted ideas of thought
development. The resulting ideas developed in the scaffolded environment
can separate from the scaffolding to join conceptual idea clusters,
with the benefit of highly reflective linking constructs. The
ideas can float away from the scaffolding of the discussion forum into
an entirely different, nebulous architecture of gravitationally attracted
idea clusters. This architecture is multi-dimensional; it is more
like a cytoplasm than a discussion environment or a book shelf; it is
more like a mass of floating dandelion seeds than a ship's dry dock
that constructed it.
The entry point for a person into the
atmosphere in which all the clustered ideas float--the cytoplasm of
the Information Society--is really a guiding overlay for the joined
concepts: it is a narration. Ideally, a narration successfully
joins a sequence of short stories, the clustered concepts, produced
as teleplays where the narrator guides a listening person through the
many knowledge constructs of linked concepts and ideas.
Listeners join into the Information Society
to become contributors as they find areas in which they are comfortable.
They then can build knowledge from knowledge. Meditating on concepts,
developing inspiration, pushing their ideas, they link their ideas to
supporting constructs. As listeners become increasingly recognizable
as contributors, their own constructs, complete with open-ended links
built from a reflection of their ideas, allow further connecting
ideas to attach to their information; new ideas are drawn to their information,
supporting or, ideally, extending their ideas. Their newly developed
ideas become a foundation for ideas in the purely fluid soup of the
Information Society.
The clustering of information, the quality
of the supporting information, and the comparison of information allowed
by clustering of concepts will build knowledge that speaks to the real
needs of the world, knowledge useful for the true aims of activism.
Links and tags can be created externally
for a developed document as the document comes to rest somewhere in
the cytoplasm of the Information Society, typically as a blog entry
or in a forum discussion thread. Information gleaned from the
document, such as the originating information the ideas were built on;
the physical source it was derived from; the author; his references;
and an aggregate of descriptive words chosen from the text can create
the basis of a linking profile. Sophisticated linking constructs built
from these clues can be used by server-based algorithms to make meaningful
connections joining differing information sources, creating new and
previously unimagined idea relationships leading to potentially valuable
discussion and conceptual alliances.
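A minimal sketch of such a linking profile, assuming only the clues named above (author, physical source, references, and descriptive words from the text); the field names and stopword list are invented for illustration:

```python
# Hypothetical sketch of a linking profile: the clues a server-based
# process might gather for one document. All field names are invented.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "it"}

def linking_profile(text, author, source, references, top_n=5):
    """Build a simple linking profile: metadata plus frequent content words."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return {
        "author": author,
        "source": source,
        "references": references,
        "keywords": [w for w, _ in counts.most_common(top_n)],
    }
```

Two documents could then be matched on any combination of shared author, shared references, or overlapping keywords, which is what would let an algorithm join differing information sources.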
Algorithms have been developed to find
key-word matches in the document to create links to other information
sources. But these algorithms presently cannot recognize the
distilled ideas behind the developed concept; they are unable to aggregate
the links into meaningful joinings of documents. Corporations
are uninterested in aggregating information concepts; commercial marketing
only justifies the flow of separate unrelated and often inaccurate information
presented with no means for comparison or meaningful dialog.
Corporate developed information links occupy the Information Society,
but provide no useful contribution.
These after-the-fact links are formed
externally by algorithms, hence they are controlled by the algorithms, rather
than allowing the author of the ideas to create a meaningful reflection
of his thoughts with which to encourage idea-linking. External,
after-the-fact, linking of concepts by corporations can be ruthless:
algorithms created by the Google search engine, for instance, pull thoughts
and insights towards commercial products, often irrelevantly.
They are unsophisticated and inaccurate because they cannot be aware
of (nor would they care about) the author's original inspiration behind
his thoughts.
As original ideas expand further outwards
into the cytoplasm of the Information Society, the technology needs
to focus inward towards the inspiration process. As people increasingly
join as contributors, after having been listeners, their ideas become
increasingly sophisticated. As they increase the volume of their
information, they improve its quality. As both the quality and
the volume increase, more time is necessarily spent in the management
of their information constructs. Hence, a pressing need for algorithms
at the personal and community levels. Linking algorithms now need
to be distributed to all the Information Society contributors, just
as the original networking services technology was transferred from
the monopolistic corporations to the world's people.
Scaffolding: The Web Community
Components of the New Model Design
Similar to action research, but developed
independently, the model I developed for accessing government, for allowing
democratic participation through the web, to directly and instantaneously
benefit those in extreme need, is a process for knowledge discovery, development,
and implementation. Unlike action research, the process is not a cycle; a single instrument
of knowledge is delivered to, in the case of the US, congressional leaders,
who can use the well-developed knowledge of the community to make decisions
and implement beneficial laws.
In my personal experience, and the experiences
of the Katrina group, good information can provide moral guidance for
timely action. The success of the Katrina group, and other successes
I have had, are surprising; congress has historically been a culture
of cynicism. Throughout American history, the purely self-interested
motives of discrimination and exploitation have seemingly dominated
the top levels of government action; in the case of the domination of
the Native tribes, US mistreatment could be the worst in history.
Of course, the natural and humane side of American culture developed
in parallel resulting in what Humanists call emerging society: people
following their own guidance based on their own experiences independent
of framed information usually coming from either the corporate controlled
government or the religious pulpit. Most, if not all, of the members
of the Katrina activist group are emerging people.
As natural people and animal lovers, many
of the members have likely been hippies sometime in their lives, or
may be so now. There is an anarchistic streak as well in most
of the members, where conceptual departures were occasionally made from
what I would consider accepted science. Looking at all the components
of the Katrina experience, both successes and difficulties, the structures
provided by the discussion threads and the group accounted for its success:
no individual effort could have influenced the misdirection of government
that New Orleans experienced. But, as a group, the discussion
forum was able to deliver accurate information, which in turn influenced
congress to create law instantly, often informally. The most remarkable
successes were with respect to animal welfare; the usually unstoppable
forces of animal killers were halted instantly. The group empowered
and coordinated highly individualistic people purely through knowledge
construction; political direct action was usually implemented outside
the group. The information discovery and development process was
the scaffolding structure, for the most part. Guidance was provided
by me to focus the information development effort, and I tried to protect
the group and its treasure of information, but neither of these efforts
brought the group together nor motivated its members. The desires
by group members to benefit the people and animals of New Orleans, structured
by the information process itself, provided the successful impetus.
Educational and information technology
efforts often work in cycles, where action research is a prime example.
My new model design is similar to action research in the ways it seeks
and develops information, but the purpose of the new model is the timely
development of information instruments for quick delivery, usually to
other activists or to government. Rather than ruminating, as action
research does, the new model moves on to other areas of distress to
repeat the information discovery, development, and delivery processes.
Text
Central to any information system or
process is the typing of text into a screen, where a button sends the
newly formed information into the Information Society. Little
input to the Information Society is done through the electronic version
of paper composition, the modern word processor. The text area
is limited, providing only letters, numbers, spaces, and new-lines.
Recently, simplistic word processors have been developed that can be
embedded into a web page. The area of development that has created
these is called AJAX, where the J is for JavaScript, the programming
language of web pages. (HTML, the markup language of the web,
is technically not a programming language, it only guides the formation
of web pages in the browser.)
These new micro word processors usually
save their information as HTML code to be embedded into the web pages
of the information system that provides them. They provide the
majority of necessary word processing functions as well as access to
spell-checking facilities. They are an important development for
community systems, because they provide basic tools for the most important
component of the Information Society, the authoring effort.
Another important contribution they make is to provide community members
with a sophisticated service, leading to an important purpose of a web
community, member support.
In the new model design, the advanced
text-area works as a tool for the member just as paper and pen work
for the traditional author. Information created in the advanced
text-area can be channeled to a variety of places simultaneously.
The most important destination is the discussion thread, but the information
stands on its own as well. It gets saved as a draft as stand-alone
information. The information is not saved so much as contained;
the container includes areas for attributes about the information, specifically
the kind of information that can be used for linking it to other information.
The container is a complex structure, an object, with a name and a location.
Such sophistication as applied to every little piece of information
is unique to the modern information system. The ability to run sophisticated
automated software processes against the text is nearly effortless;
data storage is so cheap that space used by composed text has negligible
cost.
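The container described here might be sketched as a small class holding the text alongside the attributes used for linking. Python is used for illustration, and every field name is an invented assumption:

```python
# Hypothetical sketch of the information container: the composed text plus
# attributes (name, location, linking data) carried with every piece of
# information. All field names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class InfoContainer:
    name: str                    # identifies the container
    location: str                # where it lives (draft area, thread, ...)
    text: str = ""               # the composed information itself
    tags: set = field(default_factory=set)     # author-chosen keywords
    links: list = field(default_factory=list)  # connections to other containers

    def link_to(self, other):
        """Record a link to another container, for later clustering."""
        self.links.append(other.name)

draft = InfoContainer("levee-report", "drafts/", "The levees failed...")
thread = InfoContainer("katrina-thread", "forum/katrina")
draft.link_to(thread)  # the draft stands alone yet points at the thread
```

The point of the sketch is that the text is never bare: wherever it is channeled, its name, location, and linking attributes travel with it.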
Exactly how all this data is stored and
accessed is highly optional; whatever works best for the benefit of
the system, and its members, is usually adopted. An important
factor is privacy; group members may want to keep their drafts and personal
documents very private, possibly on their own machines. This is,
of course, a personal prerogative and a valid choice. Since the
community software provides support to the user, it may make sense for
the group members to run web based software on their own computers,
a personal and local website as a subset of the community website.
An arrangement like this would help greatly if members are participating
in the development of the community, and it may also provide redundancy
for the centralized web site in the event of a technical outage, or some other
problem. Users may want to provide their own, distinct, web services
themselves.
Linking contextual objects is consensual:
objects can relate themselves to other objects, but the
target objects must reciprocate; they decide whether they want to relate to the actively linking
object. Linking services may assign links, but such services decline
in meaning as linking becomes decentralized and ubiquitous, completing in meaning at each person's node.
Beyond any specific linking service, linking
can be based simply on reflections (reflective profiles) provided by each author.
That implies reflecting only original information, but the same reflecting
rules would apply to information that has been absorbed, not just created.
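A sketch of this consensual linking, with an invented consent rule (any shared tag) standing in for a real comparison of reflective profiles; Python and all names here are illustrative only:

```python
# Hypothetical sketch of consensual linking: a link becomes real only
# when the target object agrees to relate back to the proposing object.
# The consent rule (any shared tag) is an invented stand-in for a real
# reflective-profile comparison.

def propose_link(source, target, accepted):
    """Record a link only if the target consents to relate back."""
    if target["tags"] & source["tags"]:   # target's consent rule
        accepted.append((source["name"], target["name"]))
        return True
    return False

a = {"name": "a", "tags": {"animals", "rescue"}}
b = {"name": "b", "tags": {"rescue"}}
c = {"name": "c", "tags": {"compost"}}
links = []
propose_link(a, b, links)   # accepted: shared "rescue"
propose_link(a, c, links)   # declined: nothing shared
print(links)  # [('a', 'b')]
```

The declined proposal leaves no trace, which is the difference from after-the-fact algorithmic linking: no link exists that the target object did not accept.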
Discussion: Understanding, Empathizing
Threads, modeled after Care2
Interest groups are formed by someone
who becomes the owner. Within the group, members join and they
can start discussion threads; owners can manipulate the discussions
and control the membership. There is a small space where the site
can be customized, with additions like an area for RSS feeds. Many
activists resist Care2 because it is commercial, and relies on advertising;
the advertising is fairly obtuse. Individually, people have profile
pages, which they can customize somewhat; they are similar to sites
like Myspace but far less spectacular; technologically speaking, Care2
is far behind commercial sites, and less responsive than member run
sites. While the site is primarily a discussion site, much of
its attraction is in fact the commercial side; people with sympathetic
leanings who are still mainstream in nature enjoy the ads and the online
shopping. Also, dating is an attraction, in a way not too much
different than other dating sites, except that there is an expectation
of more awareness and tolerance than found in most of society and the
online world. There is a vast variety of discussion groups; there
is something for almost everyone on Care2; it's better experienced than
described. Certain groups I have belonged to have hosted sensitive
discussions between people who would certainly be committed friends in
real life, but because of distance and differences in social status,
probably would never meet.
Care2 is a new economy project in every way:
altruism is mixed with commercialism exactly as new economists proposed
that the free market be channeled to create a better, more efficient
world. Of course, capital thrives on exploitation, and the new economy
concept was fantasy for the most part; but Care2 survived easily, probably
because it was founded at a time before the so-called angel investors
created the environment of high risk for complete control.
In the two years I have hosted discussions
on Care2, it has gone through various changes, none of them good.
They redesign their pages with incremental improvements; they completely
ignore the creative freedom given by blogging and personal sites such
as Blogspot, LiveJournal, and mySpace. The pages are completely
static, with few choices available. It is almost as if users
are stuck in rural high schools, while the rest of the web world has
moved on to art academies in the newly cultural small cities.
The Care2 discussion model is fairly
one-dimensional, yet effective. Probably Care2's biggest strength
is its community nature. The majority of members are animal lovers
of various degrees, and all members support the environment, though
sometimes in contradictory ways. There is no libertarian activist
support for corporations, such as you find for Microsoft in Usenet discussions
in Linux groups. No one calls corporate support liberalism; liberals
are still leftists; President Bush is a neoconservative on Care2, not
a neoliberal as he labels himself. Despite having an environmental
core, there often seems to be more argument than agreement on Care2.
Owned by the Care2 corporation, and closely
linked, is the Petitionsite.com. Discussion can lead to action
through the development of petitions. Traditionally, petitions
have had little meaning on the Internet; they have usually been generated
on the Usenet and passed by email from person to person, who each add
their name. Lacking any foundation or verifiable source, they
have always been ignored. The Petitionsite is effective though,
to the surprise of the rest of the Internet. Petitions generated
by Care2 users have between a few hundred and a few thousand signatures,
really entries. They are verifiable people, and there is an area
to put in comments and also a photo. During the peak of the Katrina
suffering, many petitions were generated by Care2 users through the
Petitionsite; the effect can only be guessed at, but changes in government
policy by military types, who are usually intractable, were rapid.
Petitions backed up by phone calls and faxes, especially to Congress,
are among the most effective political tools available to the average
citizen. Recently, software was implemented on Capitol Hill that
will filter out emails and petitions; it could be that authoritarian
representatives and bureaucrats are chafing at Internet democracy; there
can be no possible network security threat from an Internet petition.
A goal of the New Model is to enable individuals
with information tools for maximal effectiveness in a personal sense
in moderate numbers focusing on individual knowledge rather than directing
large numbers of users who are only moderately involved. From
my experience maximal influence with respect to government has been
large coordinated letter mailings combined with phone calls and faxing.
The purpose of the activist mailings has been to shame the government
into doing its job of enforcing various protection laws; or to influence
it to create laws. An effective group in New York has been the
League of Humane Voters; as a block, they promise (and deliver) votes
to politicians who support humane legislation; they hinge their groups
votes very often on particular legislation, and they are non-partisan,
they support any politician who supports animal kindness legislation.
These efforts by groups of individuals are rarely unsuccessful, yet
ironically, the mistreatment of animals and the destruction of the environment
accelerate. There is no real comprehension of the greater influences
of capital and globalism; globalists have, over the past decade,
successfully confused groupings of the elites of the dominant cultures
of the world with support for the world's poor by terming the elite consortia
multiculturalism. On Care2, my desires to help dominated cultures
everywhere have met with criticisms of racism, simply because most dominant
cultures happen to be non-white; there is a presumption among many people
that if I criticize a non-white, I must be a racist. For me, these
assertions, combined with other biases and twists of biases are a source
of group disruption, but it is pure speculation to wonder why this happens.
For the purposes of having a significant effect in a web community,
the key is to side-step these obstacles.
New Model
Knowledge Organization, Computer Organization
Computers and network communication applications
have the benefit of being relatively new contributions to the Information
Society, allowing for the application of new ideas in both design
and construction, and also allowing a new type of culture to grow around
them.
In contrast, many people still see computers
as simple devices for their personal benefit: the primary computer unit
of our world is, in fact, called the personal computer. Unfortunately,
this is the perception of people who have political control. Unfortunate,
also, is that political controllers have consistently exerted fascist-like
control over computer systems. The present controlling organization
in the US is Homeland Security. In the Katrina portion of this
paper, I hope I presented information to prove that this organization
is beyond incompetent--its leaders should be in prison cells as punishment
for negligence resulting in thousands of deaths. From all reports,
the Homeland Security and FEMA abuse of Katrina victims continues in
fenced camps set up for thousands of refugees from New Orleans.
In a sense, the new model of group activism
is well positioned to deal with control and abuse because of its roots
in both activist culture and the highly advanced technology and group
organization techniques of the information renaissance of the 1990s.
The inspiration of the new model is the Katrina group on Care2; the
group members are largely activists, albeit part-time, and the system
is an excellent example of enduring success of the influences of that
information renaissance: the New Economy, or technology market.
Groups
The facilitation of open and supportive
work groups was initiated by the Humanists, especially Carl Rogers,
to assist in therapy and teaching. The ideas have moved so far
into the mainstream that it is hard to find anyone who can detail the
history of the modern group dynamic. Possibly, because of the
ubiquity of small group dynamics, much of the founding philosophy of
bringing people together to resolve differences is lost. The group
dynamic often becomes a tool for local, rather than national, control.
Or, in many cases, the purpose is not lost, simply perverted to create
a weapon of personal gain.
Groups give activist efforts efficacy,
but it is the inspiration of the individual that creates for the group
direction and the knowledge necessary to make the group's efforts valid
and effective. To make a new model web community that is effective,
that is more than a voice in the forest, there has to be a transference
between group and individual efforts that is supported by the web community
application design. While the design architecture does not make
the group or define the effort, there is a relationship between the
computational executions of the application's code and the efforts by
the group members. As less of the architecture is defined by the
group effort, and the application is designed to guide the effort, the
group can function at its most efficient, either in group communication
or as individuals. An added benefit of computer communication,
especially in the discussion thread, is that both individual and group
efforts can operate simultaneously. And, because activism is usually
a volunteer effort, the work group needs to be efficient so as to allow
the group members maximal time for their normal lives.
Care2 claims to have millions of members.
While that may be true, the active core of Care2 is probably about 10,000
with about 4,000 in my immediate surroundings. Within the Care2
community are sub-groups often defined by nation, and sub-groups defined
by interest, the users' reasons for being on Care2. The largest
group is concerned with animals; many are there for recreation, others
for romantic recreation. Each member belongs to a series of groups,
and it is by these groups that their geo-political (or recreational)
location in the community can be defined. This diversity in service
guarantees Care2's longevity, despite some technical awkwardness.
While Care2 is clearly successful, I
believe that an activist web community needs to be more focused on effort,
and that that responsibility falls to the design of the web community.
Care2 has the group model well defined, but other applications have
better tools for the idea development and knowledge delivery desired
by the new model. An application that can contribute components
to the new model is Moodle, a free and open online school application.
While no school uses Moodle exclusively as its campus, many high school
teachers use it to facilitate their classes. It utilizes Wiki, Blog
and Discussion forums, as well as tools specifically designed for school
management. The project is written in PHP, the most popular web
service language. I chose PHP for the new model so that I could
re-use as much code as possible; Moodle provides nearly all of the necessary
components to be used by the new model. All that is left is to
develop the architecture, develop the information documentation standards,
and reapply Moodle's code in conjunction with other borrowed code, and
code developed specifically for the new model.
Important to the design is technical
contribution by the members, so that they can design and build the site themselves.
The primary benefit is obvious: the members take what is called "ownership"
of the project, but the other important benefit is in the reduction
of operational costs. In fact, the whole operation should run
for free. As a not for profit organization, it will require a
board for legal status; in my experience, boards need to be avoided
like the plague. (In fact, once the new model is under way, I
will start a cause to eliminate boards from legal existence, in parallel
with de-personalizing the corporate entity.)
Basic Idea
Members find issues (or already have
issues in mind) to discuss. Generally, they have seen something
that disturbs them, and they want to help. As in Care2, a member
initiates an effort by creating a new effort. The conceptual computer
effort of creating an effort is new --> effort.
As an individual, the initiator collects
information and presents it in such a way as to attract other activists.
For this the initiator needs a desktop to create with, an area to store information
collected on-line, and an area for information scanned in from her personal
computer. This repository has to be divided into personal and
public areas, both for privacy and for the concepts of scaffolding.
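The data model implied above can be sketched as follows (the class and field names are my own assumptions, not part of any existing design): an effort is created by an initiator, and its repository is split into a personal area and a public area, with material published from one to the other.

```python
from dataclasses import dataclass, field

@dataclass
class Effort:
    """A sketch of an activist effort created by an initiator."""
    title: str
    initiator: str
    personal_area: list = field(default_factory=list)  # private notes and scans
    public_area: list = field(default_factory=list)    # material shared for scaffolding

    def publish(self, item):
        """Move an item from the private area into the public one."""
        self.personal_area.remove(item)
        self.public_area.append(item)

# "new --> effort": a member initiates an effort by creating one
effort = Effort(title="Wetlands cleanup", initiator="alice")
effort.personal_area.append("scanned news clipping")
effort.publish("scanned news clipping")
```

The personal/public split is the only structural commitment here; everything else is illustrative.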
If the initiator's ideas attract other
activists, and they personally commit to investing time, then a group
effort is initiated. If not, then the effort is an individual
effort, where others may contribute comments or suggestions without
committing to creating knowledge to deliver to Congress.
Within the initiator's tool set, there
is no reason why every compositional tool ever invented should not
be accessible to sophisticate the initiator's presentation. This
includes code building tools, graphics manipulation tools, and any kind
of text editor. Since most of these tools are in existence as
applications, the code within them needs to be converted into loadable
modules to be integrated into the desktop of the initiator.
Here, I introduce what I call the ThinMan
model. ThinMan was my attempt at personal success during the new
economy. I looked at the Nintendo 64 child's computer gaming console,
which uses a 64 bit path (soon followed by a 128 bit path), and compared
it with the 16 bit Microsoft Windows system in my corporate insurance
cubicle. In comparison to the Windows computer, the Nintendo
was many times faster and eons ahead in sophistication and service to the
user. My first idea was to use little gaming boxes, like this
one, with the quickly growing Linux operating system. I developed
a marketing plan similar to the Swiss Swatch idea, where a new box would
come out every month; to connect with the youth, I would have contests
to see which school design team could create the best box exterior.
Ideally, the boxes would become collectible; they would also be upgradeable.
The most important development of the
ThinMan model was not in marketing (though I still believe the Swatch
concept is brilliant). I developed a component model based on
Perl's CPAN. The ThinMan user's desktop would be central, and
modules would load onto the computer from repositories to give the user's
desktop capabilities. It would allow the combination of tools,
and subsets of tools, right into the contextual application of the user's
text. For instance, an art application has a palette, a word processor
has text tools, and a browser has a top navigation bar. The ThinMan
user's desktop window would have both available when needed; there would
be no separation between applications. Tools would surround the contextual
effort as needed, and be discarded when used. The open and free
software model would allow the code of the modules to be loaded from
any source: a repository or a nearby ThinMan. Furthermore, an
exciting tool like the ThinMan would attract engineers all over the
world. My enthusiasm for local youth culture world-wide would
help keep the ThinMan as minimalist as possible to assure ubiquity--to
assure that any youth could have one, knowing that many youth will want
to contribute code and design features.
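A minimal sketch of the ThinMan component idea (all names here are hypothetical, and the CPAN-style loading is an assumption about the design, not an existing API): tools load as modules into one central desktop rather than running as separate applications, and are discarded when their contextual use is over.

```python
# Hypothetical sketch of a CPAN-style modular desktop: tools register
# as callables on one central desktop instead of separate applications.

class Desktop:
    """Central user desktop; tool modules load in and out as needed."""
    def __init__(self):
        self.tools = {}

    def load(self, name, tool):
        """Load a tool module from any source: a repository or a nearby ThinMan."""
        self.tools[name] = tool

    def discard(self, name):
        """Discard a tool when its contextual use is over."""
        self.tools.pop(name, None)

    def use(self, name, *args):
        return self.tools[name](*args)

desktop = Desktop()
desktop.load("palette", lambda color: f"selected {color}")  # art tool
desktop.load("spellcheck", lambda text: text.strip())       # text tool
print(desktop.use("palette", "blue"))
desktop.discard("palette")
```

The point of the sketch is the absence of application boundaries: a palette and a text tool live in the same window, surrounding the user's contextual effort.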
Extending the ThinMan modular concept
to the new model, the new model user, now in the role of initiator,
would be able to access tools through the web site to sophisticate her
presentations.
A sophisticated, yet obsolete, text editor is the visual editor, or VI. I mention it because it has an important feature; in fact, as I think about it, the VI editor was probably the most important building tool of the modern Information Society.
It was with the VI editor that
all the original Internet applications were written. VI was
developed by Bill Joy at the University of California, Berkeley, building
on the ed editor from the Bell Labs group in New Jersey (an intellectually
free and open scientific society within the repressive monopoly of Ma
Bell); today it and its clones are freely available, with a huge following.
Nonetheless, its awkwardness in use makes it obsolete in comparison
to modern text editors.
An important feature of VI is the idea
of mode: the editor is either in text-entry mode or in command mode.
When in text-entry mode, all you can do is type text into the window;
when in command mode, all you can do is manipulate text. All text
editors have to have these modes, yet most try to hide the mode-switching
with the use of the CTRL key, as in CTRL-Z to undo a mistaken edit.
VI has two distinct modes, making it interesting, though difficult to
use for anyone used to, say, word processing.
VI's distinct mode concept is an excellent
example of how computing operates; different functions are distinctly
separate--they may produce a seamless effort, but mixing modes only
confuses things and dilutes the actual effort. When in one mode,
function best in that mode; when in another mode, function best in that
one: don't try to mix modes. Unfortunately, most tools do just
that, creating isolated and diluted efforts. I, personally,
would like to see the VI team abandon VI as a tool and apply VI's mode
concept to other applications and tools, particularly the text-area
used in web pages for editing text. Using now-common graphics,
it would be easy to switch between modes and to see which mode you are
in; not knowing which mode you are in, and the difficulty of switching
modes, are the primary (yet unnecessary) weaknesses of VI.
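VI's two-mode idea can be sketched as a small state machine (a toy illustration only, not VI's actual implementation or key bindings): every keystroke is interpreted according to the current mode, and single keys switch between the modes.

```python
# Toy two-mode editor in the spirit of VI: the same keystroke means
# different things depending on the current mode.

class ModalEditor:
    def __init__(self):
        self.mode = "command"   # VI starts in command mode
        self.text = ""

    def key(self, ch):
        if self.mode == "command":
            if ch == "i":        # 'i' enters text-entry (insert) mode
                self.mode = "insert"
            elif ch == "x":      # 'x' manipulates text: delete last character
                self.text = self.text[:-1]
        else:  # insert mode: all you can do is type text
            if ch == "ESC":      # escape returns to command mode
                self.mode = "command"
            else:
                self.text += ch

ed = ModalEditor()
for ch in ["i", "h", "i", "!", "ESC", "x"]:
    ed.key(ch)
# enters insert mode, types "hi!", returns to command mode, deletes "!"
```

The hidden-mode editors the text criticizes collapse both branches into one, relying on modifier keys; VI keeps the branches explicit.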
Different Modes
Receptors and Perceptors
When I was working late the other
night, I came up with two action agents: receptors and perceptors.
They relate to the roles in group organization, where the goal is
group learning.
The modes for the new model are socially,
rather than technically, derived.
The idea initiator, being contextually
linked to the developing idea, is always in one of these modes,
and only in one at a time. Another mode choice, though
really a layer choice, is technical or social; by truly embracing the
Information Society, one learns that technical and social are really
layers rather than modes. We occupy both these layers simultaneously;
within a web community, a member is socially attempting creation and
interaction, while being situated technically, in a sequence of code.
Technical and social describe equal,
and mutually dependent, layers of human society. There is a tendency
to separate the technical and social layers along authoritarian and
democratic lines: nothing could be more wrong; authoritarian and democratic
are better viewed in terms of capitalistic and synergistic, or, parasitic
and symbiotic.
In the Group is Power
Effort and Architecture
Parallel concepts, Architecture supports
Effort, Environment creates efficacy, Reflects inside
Meditation and discussion separate individual
and group efforts. Once an individual has found something and
created a perception of the importance of the issue, for instance, what
effect a certain event or law will ultimately have, discussion validates
and modifies the ideas. Once ideas are in place, accepted by the
community, and the initiator is recognized for the contributive inspiration,
the group can start to create an instrument of change: a block
of knowledge to force into the public consciousness, or onto the desk
of a powerful politician. The effort is linear; once events are
observed and a valid perception is developed, the effort is directed
to righting a wrong and the instrument is delivered; the hurt are being
helped. If a problem persists (the Katrina group was able to resolve
the types of issues it was designed to focus on in mere days), then
more information is collected and delivered, more effectively and more
widely, until the group wins, or at least recognizes that extended efforts
will not provide results, at least using the new model design.
Short Lived Phenomenon, Discussion is done, group dissolves
While the new model is superficially
similar to the ideas behind action research, action research uses a
cyclic architecture, to keep revisiting and improving issues.
The power of the new model is in its one-shot mode; after the hurt have
felt relief, it is time to move on to something new. While the
group developed for the effort may be cohesive and socially interactive,
the place for their communing is somewhere else in the community, in
a community area, not in the area of the development of effective weapons
of truth. In Care2, there was a tendency to try to hijack groups
for the purposes of social conviviality and social control; at best
these efforts dilute efficacy, at worst they attempt to take over and
re-channel the entire effort by eliminating the idea initiator (the
initiator is usually referred to as the owner in Internet communities).
The new model architecture attempts to
bypass, and effectively neutralize, negative social dynamics by narrowing
the focus of the effort to understanding particular problems and locating
persons in authority to shame into stopping the problem.
The place for social interaction is in
a social space, where common areas of discussion are situated around
members' personal places of work, their desktops. The web usually
has real estate terminology applied to it to describe spaces; you enter
a chat room, or visit a website. In real estate terms, the activist
effort in the new model is like camping. Camps are impermanent,
but highly effective for fun and activity. Memories of camping
trips last a lifetime; those unhappy with camping go home, and never
go camping again. The home, on the other hand, is a place where
problems can really fester; homes are necessary but hardly ideal.
Homes are generally designed for comfort, but I find I sleep better
while camping. Homes, in my experience, are necessary for only
one thing, getting to work on time. They also provide electrical
outlets, something unavailable in the forest, and necessary for the
modern Information Society. But, with electrical outlets come
televisions: the worst black-hole of time wastage ever invented.
With homes come plumbing, a nice feature also not found in the woods.
But, the web is virtual; things like food, drink, and hygiene are often
discussed but they don't actually exist in the electrical cosmos of
the Internet.
For the purposes of the new model, I
would prefer that the socialites design and build the social atmosphere,
where I would no doubt be a frequent visitor. I am personally
far more concerned about the architecture of discussion, the development
of ideas, and timely changes for the benefit of the planet Earth.
Socially sophisticating threads is hazardous
The community-group-topic discussion
architecture of Care2 is too one-dimensional; when a group becomes truly
effective, a variety of discussion types and documentation needs emerge
that have to be handled socially, usually by creating rules. In
the Katrina group, I referred to rules as guidelines or suggestions,
but one assistant of mine saw the guidance for what it is: rules.
She embraced them to a degree that the rules became the focus of the
group, specifically how she was implementing them to exert personal
control. In a generalized sense, healthy groups evolve by extending
themselves in useful ways, creating venues to support their efforts.
Likewise, needs perceived by new model members can be accommodated with
modular code to create new types of discussion venues. These modules
can be implemented at any time. It makes sense to allow members
to design the discussion venues as they are needed, giving them recognition
for their inventiveness, but I will try to describe different types
of discussion threads.
Discussion Thread Design
Discussion goes from observing to understanding,
from comprehending to deeply empathizing; it is a meditative transference
of personal ideas into a public domain of discussion.
Phases
Ideas, expressed, are text; they are
often nebulous. Text evolves in context through discussion, and
ideas become more relevant in the discussion environment. Text
is an idea, or a thought; context is environmental.
In archeology:
"The position and associations of an artifact, feature, or archaeological find in space and time. Noting where the artifact was found and what was around it assists archaeologists in determining chronology and interpreting function and significance. Loss of context strips an artifact of meaning and makes it more difficult (sometimes, impossible) to determine function"
http://www.archaeological.org
Different Types of Writings
Initiating the process
Group members need little encouragement, they are motivated activists, information sharing
Teach not how to learn, but how to mediate and empathize to really have an effect, to focus information as individuals
Develop ideas individually, return to
group,
Existing thread models
Scaffolding Discussions
Discussion threads are a knowledge organization
school; they develop
critical analysis techniques, especially self-analysis. Members
leave and return to threads in receptor/perceptor cycles; they leave the
scaffolding of the thread to work individually. The best ideas
are often created individually. Pairs can work on especially difficult
constructs, probably best using the telephone or working in person.
Death of a Thread
Threads get abandoned when projects end, but
threads provide valuable resources, and sequential discussions in threads
can provide scaffolding. Threads can be encapsulated for further
discussion, or can become material for presentation. They can even
initiate new projects.
Beyond the Discussion Thread
Text can be thought of as words created
of letters, but it is, by definition, more than that; the word text derives
from the concept of weaving. Text is push-thought. But,
to become valid in society, ideas (push-thoughts) need to move into
some kind of context.
"Textual knowledge is that relevant to understanding of grammatical aspects of the language"
"Contextual knowledge means the
awareness of inter-sentential relationships and the cumulative impact
of all preceding text"
Going beyond that, many define further
steps ideas can take into their environments, usually referring to them
as something like extra-contextual. Language moves from statement
units into a context of surrounding statements, and finally into a social
environment to gain real and relevant meaning.
"elements that exceed lexical
definitions, sentential rules, and compositional principles" exist
in "social structures, cultures, expectations, values, behaviors,
and language use."
In constructivism, the extra-contextual environment is thought of as situational. Situational learning goes a dimension beyond contextual learning. It describes the environment a student experiences as she heads out into the sea of discovery, having been released from the initial community of learning, which is usually made safe with scaffolding. She is truly at sea; her learnings lead to discoveries which in turn create many questions, confusion, frustration, and self-doubt. The process of mastery in new areas of learning is fraught with risk and difficulty.
Situational discovery provides ideas
so new that no scaffolding can possibly support them. Situated
learning is where significant contributions can evolve; where truly
revolutionary ideas and solutions to perennial problems can be discovered
and developed. Possibly, the ideal work group size for situated
learning is two, to provide mutual support. More than two may
result in a desire to turn back into safer areas of learning.
And, a solo learner will have to weather the emotional stress of self-doubt
unassisted. In project science, it is the responsibility of the
teacher to help situated students create structures for inquiry, which
may, of course, evolve into newly situated confusion. The teacher
will have to develop a sense of humor about all this.
Teachers assist students in their project
development by helping them analyze new questions, and mysteries, to
help them better design inquiry paths as their situated learning reveals
to them questions they never knew existed. In project science,
the primary goal is building knowledge development skills first, then
achieving significant understandings as contributions to community knowledge.
At a certain point, a discussion will
ideally revert into individual efforts again because the supports provided
by group discussion no longer help the creation of new ideas and significant
change. Questions will arise as the result of attempting understanding;
the process of inserting thoughts into the group context, hopefully,
will raise questions which will lead to significant new discovery.
As efforts become situated in the environment, where true impact can
be experienced, relationships may develop with the information target
audience, hopefully creating new potentials for generalized perception,
action, and societal influence.
In the Information Society itself, as
separate from human society, ideas spawn into the cytoplasm of contextual
interaction. They become situated with each other: clustered into
community by commonalities. So, where is the risk? I imagine
it is in releasing into the Information Society cytoplasm thoughts whose
ideas are so original that they can find no thoughts to link with.
But, this seems highly unlikely; the power of the Internet is to link
all of humanity. With six billion humans out there, there must
be someone thinking the same things; the cytoplasm is safe.
When is a thought, or idea, best set
free? Should it be released as an embryonic idea, an idea that
developed within the context of a discussion (the zone of personal development),
or an idea that evolved as the result of risky situational discovery?
Could it be that within the process of the empathic delivery of influential
information, newly developed information becomes the basis of the initiation
of a new comprehension, empathizing, and knowledge delivery? Knowledge
building is the process of the creation of embryonic ideas.
The actual cumulative effect of the community
may ultimately be to spawn embryonic ideas based on questions developed
from successful previous knowledge building, an extension of the classical
scientific corollary. Resolving political injustices, the activist
task, is in effect the process of creating meaning that can be effectively
used to construct society's knowledge tree.
The Information Society, now operating
far beyond the fairness ideals of two-way communication envisioned
by the Humanist engineers Lewis Mumford and Buckminster Fuller, can
now allow activists using thought-linking to arrange their concerns
into a matrix of ideas which are not criticisms but beneficial suggestions.
This is an improvement of the traditional activist role of complainer.
It is possible that a fully sophisticated knowledge structure can be
visualized by anybody looking at a snowflake. This structure can
be laid down on top of the topography of society, which happens to be
the surface of the planet, to create a matrix of interactions that are
wholly beneficial and based on trust: a resolution to the pain caused
by the inequity of what is called free trade.
The features that make humans beneficial
(very likely the ideas developed innocently as children become situated
in society), can be accessed easily through linking. Contextual
ideas neighboring originally and locally developed ideas are already
familiar, because linked ideas have commonalities. When ideas are
linked, thoughts can evolve conceptually based on incremental differences.
As people traverse a (or the) matrix of linked ideas, no great
leaps are necessary. People accessing information already have
mastery of much of the neighboring information; neighboring ideas can
be easily mastered, allowing easy travel on to other slightly distant
neighboring ideas.
Possibly a strong validating feature
in contextual linking is consensual linking. Central to a web-based
friendship, possibly grown from a real-life friendship, is the idea of
permission. On Care2, you ask permission from another member
to be his friend. It is rare that friendships are rejected, but
the process requires consent. In real life, people who see things
they like in each other bond mutually and simultaneously. Delays
within Internet communication systems, as well as difficulty in perceiving
emotions through the Internet, prevent instantaneous bonding.
But, once online friends are introduced in real life, that bonding often
occurs.
Linking takes contextual statements
from paragraphs and bigger documents, and creates from many linked
statements the thought basis of highly synthesized documents.
Servers can link contextual text mechanically; but only
the human can reflect on text within context, giving it meaningful linking
attributes.
As part of the service process of idea
linking (something has to do the linking, it might as well be a public
domain free system) matched linking profiles would trigger messages
seeing if mutual linking is possible. Mutually consensual linking
will certainly reinforce the validity of ideas within common areas of
comprehension. Opposing ideas, defined by mutually opposing linking
profiles, will very likely never connect, though a daring linking server
might try to interrelate opposing ideas in an attempt to seek a resolution
between them. This may, instead, result in conflict. Still,
within the area of idea comparison, and knowledge construction, idea
differences never result in anything illegal. The worst type of
Internet conflict is the flame-war, the online trading of insults: hardly
something to worry about.
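The consensual linking described above can be sketched in a few lines (the profile format, matching rule, and function names are my own assumptions, not an existing system): matched linking profiles trigger a proposal, and a link is created only with mutual consent.

```python
# Hypothetical sketch of consensual linking: a link forms only when
# two linking profiles share common terms AND both owners consent.

def profiles_match(profile_a, profile_b, threshold=1):
    """Matched linking profiles would trigger a proposal message."""
    return len(set(profile_a) & set(profile_b)) >= threshold

def propose_link(idea_a, idea_b, consent_a, consent_b):
    """Return True (link created) only on a profile match plus mutual consent."""
    if not profiles_match(idea_a["profile"], idea_b["profile"]):
        return False
    return consent_a and consent_b

wetlands = {"name": "wetland loss", "profile": ["levees", "erosion", "gulf"]}
levee = {"name": "levee policy", "profile": ["levees", "funding"]}

# both owners consent, so the two ideas become mutually linked
print(propose_link(wetlands, levee, consent_a=True, consent_b=True))
```

Opposing profiles simply never reach the threshold, which is why, as the text notes, opposing ideas are unlikely to connect without a deliberately daring linking server.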
Societal knowledge thus built and tested
in the activist context as well as the linking process, including consensual
linking, would likely be unassailable by the lies of deliberate disruption
and free of corruption--a significant improvement over the arbitrary
law-making and enforcement process under which we now labor.
It is only in the journey into unexplored
areas that comprehension and empathic understanding may be difficult.
This may be because many problems have remained unresolved. The
journey into discomfort, pain, and even death (the realm of the activist)
is fractured and unsupported. The human mind becomes nearly unable
to process successfully, yet activists go there anyway. The new
model exists to assure the support of an expansive community in this
emotionally dangerous area of situated societal discovery.
Thread Analysis and Presentation of Threads as Documents
When I wrote about Katrina by trying
to understand the discussion in the Katrina forum, I was at first worried
that I did not have enough information (I had forgotten the success of the
effort); I then found that I had a wealth of information that was both
unique in its completeness, and difficult to manage. Valuable
information was usually contained in single postings within discussions;
usually, as a single member found, and reflected on, an information
source, the group discussing the issue would move further into the issue,
sometimes the group would digress onto other topics; sometimes a member
would be inspired by the topic, yet write valuable information about
a related but different topic. We were successful in keeping most
discussion within topics, but with the use of searching tools, tangential
discussions, or even solo, unrelated, comments are all accessible.
When actually synthesizing a thesis about the disaster, only a few times
did discussion influence my thinking. I simply extracted testimony
and critical inquiry, meditated on it, and used it to develop my writings.
It is possible, because of my search tactics, that my reading was not
as contextual as it could have been; it is possible that better search
tools would allow information to be viewed in the context of comparisons.
When reading the anecdotes of Texas school teachers to develop my hurricane
project science ideas for middle school students, context was everything.
Furthermore, the discussion was completely situated in every way imaginable.
Searching tools, in that case, would have only gotten in the way; every
bit of testimony was valuable.
The Katrina forum had incredible chaff
though, and sifting through it was cumbersome. Later on, when
expanding on my notes, I used search tools to bring me back to the reference
text and to find related text. While the work was difficult, and
emotionally stressful, it did not take very long to fully comprehend
the crisis and the events that followed over months: it took only about
three days. For the volume of notes produced, hundreds of pages,
this seems a short time; this was possibly because I had already been
familiar with much of the discussion. Had I known ahead of time
the volume of the source text in the forums, I would have written a
program to sift through the material, and to identify each statement
as an individual information source. By using the source of each
statement in terms of discussion threads, and by doing keyword searches
based on sophisticated pattern matching algorithms (in the Perl language),
I could easily pull from the source information every specific issue
in context, while eliminating related but unhelpful statements, or statements
completely out of context but in the threads. I regret not writing
this program because I could have applied it to the entirety of Care2. I
could have used every statement on Care2 as a potential source, not
just in the Katrina discussion, and I might have also enabled researchers
to do the same with unrelated studies they may be doing. Since the
new model is inspired by Care2, scripts for analyzing Care2 discussion
may be useful in developing topics in the new model community.
Certainly, a similar program could be used to search for kindred spirits
presently working on Care2; this would help the new model community
recruit new contributors.
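The sifting program described above could be sketched as follows (the thread format, sample text, and function names are my assumptions; the original plan called for Perl's pattern matching, shown here in Python's re module): each posting is treated as an individual information source, tagged with its thread, and pulled out by keyword pattern.

```python
import re

# Hypothetical sketch: treat each posting as an individual information
# source, keep its thread as context, and pull statements by pattern.

threads = {
    "levees": [
        "The 17th Street canal levee was breached early on.",
        "My cousin is safe in Houston now.",
        "Levee funding was cut repeatedly before the storm.",
    ],
    "evacuation": [
        "School buses sat unused in the flooded lot.",
    ],
}

def extract(threads, pattern):
    """Return (thread, statement) pairs whose text matches the pattern."""
    rx = re.compile(pattern, re.IGNORECASE)
    return [(name, post)
            for name, posts in threads.items()
            for post in posts
            if rx.search(post)]

for thread, post in extract(threads, r"levee|bus"):
    print(thread, "->", post)
```

Because each hit carries its thread name, relevant statements stay referenced within their context, while off-topic postings (the "chaff") simply never match.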
Placing discussion threads in the time
context of events catalyzing the discussion can be difficult and confusing.
Usually discussion threads are dated by the date they were initiated,
or by the date the last post was added. This can be confusing
when researching threads, because very crucial threads that start at
the outset of a crisis will continue to be updated throughout the crisis.
If they are dated by the time stamp of the last entry, they will likely
be reviewed last, even though they are the most important discussion
of the entire situation. When researching threads, it makes more
sense to read them starting from their initiating dates; when updating
threads with new information, it makes sense to look at their last postings
first. But, when simply trying to scan a threaded information
source to get a general perception, say, at the beginning of a study,
it makes sense to date threads based on the peak of discussion activity
within the thread. Then, you may see a discussion that is different
from the initiating topic, discussion which has resulted from an evolution
of understanding. Or, you may find a voluminous and unrelated
personal argument that skews the meaning of the thread. Discussion
activity tends to follow a bell curve when analyzed. It usually
takes a day for a group to grasp a situation, then maybe a day to find
relevant text to contribute and comment on, then the action phase may
take a day or so. If the topic is important, there will be decreasing
discussion for a period of days or months. However, commentary
or discussion posted after the main activity period tends to be the most valuable;
there has been time for considering the issues and finding the truth.
Often, two or more equally valid perspectives will be developed interdependently
and collide, respectfully and maturely, late in a discussion.
The mindset and actions of the New Orleans mayor during the crisis,
especially with respect to the school buses that were photographed under
flood waters, will, for a long time, be an issue of discussion. Probably
the best way to analyze text is to utilize various search strategies,
and when interesting text is found, highlight it and add tags to it
within the context of its discussion. Then, later on, use the
tags to reference the text. In this way, no information gets left
behind, and all of it can be referenced within its context as new information
or ways of viewing the information come along. Keeping relevant
information to a minimal number of references prevents extraneous information
from diluting it. Keeping the information referenced but in context
prevents it from being comprehended out of context; prevents the past
from being rewritten for whatever reasons.
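The peak-activity dating idea above can be sketched like this (the post format and function name are my assumptions): a thread is assigned the day on which it received the most posts, rather than its initiation date or its last-post date.

```python
from collections import Counter
from datetime import date

# Hypothetical sketch: date a thread by its peak day of discussion
# activity, not by its first or last posting.

def peak_activity_date(post_dates):
    """Return the day with the most posts in a thread."""
    counts = Counter(post_dates)
    # most_common(1) yields [(day, count)] for the busiest day
    return counts.most_common(1)[0][0]

thread_posts = [
    date(2005, 8, 29),                      # thread initiated
    date(2005, 8, 30), date(2005, 8, 30),   # discussion peaks here
    date(2005, 8, 30),
    date(2005, 10, 2),                      # late, considered commentary
]
print(peak_activity_date(thread_posts))
```

Since discussion activity tends to follow a bell curve, the busiest day approximates the curve's peak, which is usually more representative than either endpoint.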
Ultimately, a well mapped discussion,
with pointers to discussion text based on topic analysis will be useful
to any readers seeking to develop for themselves a perception of events
as they were comprehended and empathized by the discussion group members.
This removes the need for mediating analysis, but strengthens the responsibility
of discussion analysis. Writers using the discussion material
can focus on reflection and subjective understandings, rather than simply
recounting facts and events.
A value of the Internet is that the
same material made available and delivered on one day can be updated,
and improved, the next, and so forth. The delivery date is not
the date it reached the web, but the day that an audience member read
it. Keeping resource information well mapped allows timely updating
of information. Print information, and emails, cannot be updated,
and even when updates are issued they often don't reach the full audience.
Much, if not most, knowledge and meaning of information is never delivered;
the Internet is creating many new meanings for knowledge dissemination.
The Server
Every aspect of the system should
be within everybody's reach. To be truly democratic to the individual,
the entire community should be understandable by an unexceptional member.
Effectively, an individual should be able to maintain
service either on her own computer or on the community system.
The term server implies a central machine
to which everyone goes to be connected; even though that is how it works
in the WWW, the client / server model never meant for a person to be
a client, and an organization to be represented by a server. Yet,
that is how it is done. A server really is a computer process
that is on all the time, listening for a connecting call from another
computer. That connecting call is made by a client program.
That is all the terms mean; but the terminology has determined how things
work. Another, parallel, technical terminology, master / slave, is more
repugnant in implication, and survives for reasons unknown to me.
The reason for the survival of the master / slave terminology is possibly
connected to the mis-meanings of the client / server terminology.
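The listening-process description above can be illustrated with a minimal sketch using Python's standard socket library (the message text is arbitrary): the server is nothing more than a process waiting for a connecting call, and the client is the program that makes the call.

```python
import socket
import threading

# Minimal illustration: a "server" is just a process listening for a
# connecting call; a "client" is the program that makes the call.

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def serve_once():
    conn, _addr = srv.accept()   # block until a client calls in
    with conn:
        conn.sendall(b"hello from the listening process")

t = threading.Thread(target=serve_once)
t.start()

# the client makes the connecting call
with socket.create_connection(("127.0.0.1", port)) as cli:
    reply = cli.recv(1024)
t.join()
srv.close()
print(reply.decode())
```

Nothing here distinguishes an organization's machine from an individual's: either side could run the listening process, which is the point made in the next paragraph.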
The Internet is a highly distributed
model; the idea is that each user is a provider, someone who serves
information. A knowledge builder and provider has on her desktop
a connected machine with services. The man who invented the web
server, Tim Berners-Lee, developed such a system for himself so that
he, and the world, could provide information. The idea that services
would be held centrally to be supplied by controlling organizations
is contrary to the meanings of the Internet. Because the Internet
became privatized after being the property of the people (in the US,
at least), only organizations that could afford Internet connectivity
could provide Internet services. Some lucky technologists working
for beneficial organizations were allowed to run their own services,
creating a model for democracy. With the arrival of broad band
services, users can now operate web servers from their homes, albeit
against the rules. Also, ISPs sell website space for as little
as $40 per year, enhancing Internet democracy. All these services
offer PHP functionality, the most popular web service technology used
today; the PHP technology is free and openly available. Web services
are notoriously constricting, though. To achieve true Internet
presence, operating services from a broadband connection makes a user
truly part of the Information Society.
In the new model, the initiation process
of a statement emanates from the initiator's desktop. This can
be her own computer desktop, or a remote server providing desktop
features. Privacy may be a reason why a new model site member
may want to work entirely at home; or the community member may be isolated
from the Internet for a time, yet want to keep working with all the technology
designed to enable her. A member may also be a developer,
contributing to the technical design of the community site, who must
therefore work in isolation until her contribution is fully
functional. To provide all the functionality of the
web community on both local machines and central supporting machines,
both types of machines might as well run complete copies of the
server software. Today, by using virtual machines and scripting
languages, the physical size of server software is very small,
and the code behind a web server can be very small. The many
people running the Google Desktop software on their computers
are nearly all unaware that they are running web servers locally.
They are, and there is no reason why local machines cannot
provide web services.
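The point that any local machine can serve web pages is easy to demonstrate with Python's standard library alone; the handler and the page text below are invented for the sketch:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class LocalPageHandler(BaseHTTPRequestHandler):
    """Serves one page, the way a desktop tool quietly serves local content."""
    def do_GET(self):
        body = b"<p>served from a local machine</p>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # stay quiet, like the unnoticed desktop servers in the text

# Port 0 asks the OS for any free port; no privileges or ISP needed.
server = HTTPServer(("127.0.0.1", 0), LocalPageHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
with urllib.request.urlopen(url) as resp:
    page = resp.read()
server.shutdown()
print(page.decode())
```

The whole server fits in a few lines, which supports the claim that the code behind a web server can be very small.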
A fully distributed model of web services
enhances democracy in other ways. With services on so many
machines, centralized control is frustrated, whether by governments or by the
corporations and consortia who see themselves as governments.
The first truly distributed service-sharing model is the MP3 revolution,
where each person looking for music is also offering files to share,
usually music. Not only that, they offer information about other
computers offering files to share. The system is entirely diffuse
and evenly distributed; members of this huge network generally feel
good about sharing, as well as downloading. In fact, the level
of sharing can often be predicted by the type of music involved.
Opera is often nearly impossible to download; opera fans like to obtain
music, but they are far less likely to share than fans of contemporary popular
music. (MP3 reference) The political strength of the distributed
model is undisputed; despite operating in defiance of copyright laws
(felt to be unfair by many), only centralized file-sharing services have
faced lawsuits.
The key to operating distributed community
services is being able to blend all the distributed information and
redistribute it so it is available, instantaneously, from a variety
of servers, if not all of them. Work on personally
used servers resynchronizes with centralized, or upstream, servers,
so that groups can get the full picture of individual efforts.
Individual small servers may actually be community systems in parts
of the world not well served by the Internet (in fact, most of the world
is still unserved democratically). Those small community servers
may synchronize with upstream servers through store-and-forward systems.
Their information may wait to synchronize with an overpassing satellite,
or a commercial plane carrying Internet radio equipment. Systems may
also rely on atmospheric phenomena to "skip" their
upstream synchronization through the ionosphere. These technical
ideas are closely linked to the ThinMan model, where the technical functionality
is closely linked to the subjective uses of the systems, and both are
closely related to the hardware, to assure efficient use of system
availability and the democratic distribution of new model resources.
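A store-and-forward synchronization of the kind described, where a small community server queues work until an upstream link (a satellite overpass, say) appears, might look like this minimal sketch; the node names and note contents are invented:

```python
from dataclasses import dataclass, field

@dataclass
class StoreAndForwardNode:
    """A community server that queues statements until an upstream link is up."""
    name: str
    outbox: list = field(default_factory=list)   # ids waiting for a link
    store: dict = field(default_factory=dict)    # id -> statement text

    def write(self, note_id, text):
        """Work offline: store locally and queue for later forwarding."""
        self.store[note_id] = text
        self.outbox.append(note_id)

    def synchronize(self, upstream):
        """Called only while a link (overpass, ionospheric skip) exists."""
        for note_id in self.outbox:              # push the queued backlog up
            upstream.store[note_id] = self.store[note_id]
        self.outbox.clear()
        for note_id, text in upstream.store.items():
            self.store.setdefault(note_id, text)  # pull down what we lack

village = StoreAndForwardNode("village")
hub = StoreAndForwardNode("hub")
hub.store["h1"] = "statement from another community"

village.write("v1", "local statement written offline")
village.synchronize(hub)   # the overpass finally happens

print(sorted(hub.store), sorted(village.store))
```

After one synchronization both nodes hold both statements, so groups get the full picture of individual efforts even across intermittent links.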
If the community is small, all the information
can be kept everywhere; that is, all the information members individually
feel they want to share. A large community will have to be more
selective for the sake of the technology. Rather than holding information
on every server, servers can hold pointers to information, and
in some cases act as proxies for distributing information to get around
restrictive government or corporate Internet blockages.
The blending architecture requires a
hierarchical system that is democratically developed. The more straightforward
the design, the easier it is to maintain, and the more effort
can be put into the subjective work of building meanings in humanity.
The supported discussions, the scaffolding as applied to knowledge building,
have to be straightforward and intuitive. But when thoughts and
statements leave the scaffolding, when the binding force changes from
group discussion threads to natural linking in the cytoplasm of the
Information Society, the structure changes from a hierarchical structure,
pyramidal in shape, to something resembling spaghetti. While we
can easily navigate a discussion system to find interaction, we will
have to trust that linking technology will expose us to the types of
information, and authors, we need to interact with to help us expand
our meanings.
Complex Structure Design
5 layers of 5 categories, Absorbing
knowledge for storage
Fairly large informational structures
can be built, and people can easily understand them, if they are built
in layers. In working with system information, a human-unfriendly
kind of information, I was able to build simple-to-navigate, yet highly
complex, data structures by building complex structures five layers deep, each
layer with five categories. 3,125 individual areas of knowledge (five to the
fifth power) can be located from a single point of entry into a data structure,
with only five choices being made five times. The choices made are highly
intuitive, and they are made only five times, giving nearly instantaneous
access to a large array of information. Given how comforting
this can be, many if not most community members will be unlikely
to leave the predictability and comfort of data organized in democratically
developed complexity for the unpredictable complexity of random joinings
of meanings situated in the cytoplasm. But many will, and as
ideal community learners, they will bring meanings,
and possibly even raw technology, into the structure.
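The five-layer, five-category structure can be sketched as a nested mapping navigated by five successive choices; the generic category names are placeholders, not part of the design:

```python
def build(depth, branching):
    """Build a layered category tree: each node holds `branching` children."""
    if depth == 0:
        return "knowledge area"        # a leaf: one individual area of knowledge
    return {f"category-{i}": build(depth - 1, branching)
            for i in range(branching)}

def count_leaves(node):
    """Count the individual knowledge areas reachable from the entry point."""
    if not isinstance(node, dict):
        return 1
    return sum(count_leaves(child) for child in node.values())

def navigate(node, choices):
    """Five intuitive choices, made five times, reach any single area."""
    for choice in choices:
        node = node[choice]
    return node

tree = build(depth=5, branching=5)
print(count_leaves(tree))                        # 5**5 = 3125 areas
leaf = navigate(tree, ["category-2"] * 5)        # one path of five choices
```

Every area is exactly five choices from the single entry point, which is what makes the structure both large and quickly navigable.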
The information held in Care2 discussion
groups is two-dimensional in comparison to a five-layer system.
Lacking complexity, it is far more random in nature, and it
lacks linking support, of course. Absorbing Care2 information,
or any other obtained information, is doable when categories can be
created by understanding the specific factors influencing particular
events. Linking within the structure is different from openly
situated linking, because the links are created after generalized information
has been absorbed into a layered information structure. These
links are structural links, and may give rise to alternate generalized
categories of information, or the newly linked relationships may simply
be mapped by concept-mapping structures. This kind of linking
is nearly unrelated to openly situated linking in the cytoplasm, though
linking in the cytoplasm may at some point provide knowledge that can
be absorbed by a generalized layered structure. Conceivably, too,
information stored in generalized layered structures can be aggregated,
profiled, and released for linking; but data synergized in this way is
mechanical and completely dissimilar to the ideally inspired knowledge
behind the release of push notes seeking community with other inspired
constructs. It is likely that synergized information generated
from generalized structures will be rejected in a consensual contextual
linking arrangement. By protecting themselves thus, thought clusters
and neighborhoods built of knowledge can shield themselves from the
repugnant influences that plague both the human subjective and technical
aspects of information, such as Google advertising.
Document Model
Different Types of Writings
Messages and responses, Statements,
Position Documents, Significant Information Contributions
Key to the creation, the intuitive categorization,
and the attractive presentation of new model information is the design
of the expressive document, and the organization of documents into web
page presentations that are roughly representative of the technical
structure of the new model community. While the XML standard is
designed to provide document structure, it is complicated, and in and of
itself does not provide a web page. CSS, or cascading style sheets,
as a publishing tool creates beautiful pages in ways so flexible that
the same page can be published with different CSS control files to look
like any of a variety of completely different pages. CSS also offers object
orientation in a simple way, which can reflect the structure of the
community and the presentation ideas; the class definition is
expressed through the <DIV> element. Another benefit of CSS
is that it integrates well with XML, should XML ever be adopted,
and it works with XML variants such as MathML, an important variant
when working on scientific topics, which is a likely possibility in
the community. Simplicity, functionality, and organization are
important in document standard design to encourage community members
to take ownership of their technology, and improve on it.
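As one concrete illustration, the fragment below pairs a <DIV>-structured statement with a small CSS control file; the class names (statement, heading, abstract, body) are illustrative assumptions, not a settled community standard. Swapping the style block for a different control file would restyle the same text completely.

```html
<!-- A hypothetical control file; replacing it restyles the same page. -->
<style>
  div.statement { width: 40em; margin: 1em auto; }
  div.heading   { font-size: 150%; font-weight: bold; }
  div.abstract  { font-style: italic; color: #444; }
  div.body      { line-height: 1.4; }
</style>

<!-- The document itself: each class attribute names a type of writing. -->
<div class="statement">
  <div class="heading">A Pushed Idea</div>
  <div class="abstract">A one-line summary of the statement.</div>
  <div class="body">The full text, placed and formatted by whichever
  control file the community, or the individual reader, supplies.</div>
</div>
```

The markup carries only meaning (what kind of text this is); all appearance decisions live in the replaceable control file.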
In the <DIV> structure is a class
attribute that associates all the text in that <DIV> area with specific
formatting controls.
The class attribute describes the type of text being written in simple
terms, such as heading or paragraph. It also describes types of
writing, such as an abstract or the body of a statement. The smaller
meanings, meant to format text, are encased in the bigger meanings, which
place bodies of text into a space that gives them a meaningful format
and pleasing appearance. Larger meanings, describing the types
of writing, such as discussion threads, information sources, or delivery-ready
knowledge constructs, place them correctly in the web page. The
control files that format the page based on the class attribute in
the <DIV> structure determine where everything goes and how it
looks, much like a newspaper. The same information, even the same
whole web page, can have different appearances and arrangements depending
on its context and intended audience. Furthermore, individual
readers can influence page format by specifying their own control files
that they have created. This shows the importance of having individual
web services accessible to every member, presumably running on their
own computer, or within a community computer installation. In
the end, anyone seeking individual expression in the web presentation
will have to learn to code in text, and possibly learn to program
in the language of the web server. To make this possible, and even
popular, the languages have to be simple. Creating simple languages
is a challenge in and of itself; the effect of the web has been to
over-complicate technology, possibly for profit reasons. It is nearly
impossible to get a simple answer to a simple question, let alone a
complex one. Often, technology solutions are best developed by
youth, who are open-minded and build original knowledge in a community context.
Topology tends to come in Triads and
Efforts in Pairs
Topology is the geography of the new
model site. Effort is the community; the community exists on the
site as an agent, and a resting member is effectively absent. Added to
the list of comparative pairs is active and absent. Only in the
delivery of a construct to those in authority, or the departure of a
thought to situated linking, is there a third effort; but both of those
relate to the space outside the cytoplasm of the community. They are,
from the new model's perspective, outer space.
Topological labels:
The goal is to create cascading (CSS)
class attributes that define text, in context, and document type (or
location). Likewise, the class attributes must relate to zones:
what is personally private, what is group-shared, and what is available
to the world. In effect, there are only three layers in the cascading
style structure; only three formatting control files are necessary within
a page. And, equally simple, there are only two layers within
the three structures: the level of evolution of the text, and the location
of the text within the community (or out in the world).
The five-by-five pyramidal structure
size is nowhere near exceeded; we have three and two (or two and three,
depending on perspective) when controlling document appearance and style.
The inception and evolution of the design style will, very likely, influence
the actual structure of the discussions, as discussion and presentation
are inherently subjective: function will follow form. In the new
model, only supporting evidence is objective; and, even in evidence,
there will be alternate truths presented.
Text Linking
The linking concept is far more complicated;
there needs to be a linking tool set. Postings to threads, or
statements, need to be profiled by the user; the discussion is only
one linking attribute. By highlighting text, and applying relevant
links to that text, the text becomes in itself a unique statement. Swaths
of highlighted text can be joined in linking separate from their contextual
statements, with the danger of the highlighted text losing its original
context. But linking, as situated, is risky.
Information sources can also be useful
for linking, as the linked text can inherit the reputation of the cited
sources, raising the validity of the text, and its attractiveness to
outside readers.
Linking attributes:
Linking within the Community
With a linking toolkit, members can easily
create linking profiles for their text; likewise, other members, implicitly
having consent, can also add linking attributes. Linking does
not change text, but it can pull it from its context, changing, and probably
improving, context and meaning. Sophisticated linking can presumably
initiate new discussion, even new projects. Linking toolkits
can be connected to the messaging systems that members use to communicate
on an ad hoc basis; the linking profiles should give members an idea
of the growing discussion as it evolves.
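A linking toolkit of the kind described might, as one possible sketch, represent highlighted swaths of text as records carrying linking attributes; all names and example texts below are invented:

```python
from dataclasses import dataclass, field

@dataclass
class LinkedSpan:
    """A highlighted swath of text with linking attributes attached."""
    text: str
    author: str
    links: dict = field(default_factory=dict)   # attribute -> target

class LinkingToolkit:
    def __init__(self):
        self.spans = []

    def highlight(self, text, author):
        """Highlighting text turns it into a unique, linkable statement."""
        span = LinkedSpan(text, author)
        self.spans.append(span)
        return span

    def add_link(self, span, attribute, target):
        """Any member, implicitly having consent, may add an attribute."""
        span.links[attribute] = target

    def profile(self, attribute):
        """Gather every span sharing an attribute: a growing discussion."""
        return [s for s in self.spans if attribute in s.links]

kit = LinkingToolkit()
a = kit.highlight("levees were known to be weak", "member-1")
b = kit.highlight("flood planning was underfunded", "member-2")
kit.add_link(a, "topic", "disaster-preparedness")
kit.add_link(b, "topic", "disaster-preparedness")

related = kit.profile("topic")
print(len(related))
```

Note that linking never alters the stored text; the profile only joins spans, so the original statements survive intact even as new contexts form around them.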
Social model grows
Sites, Communities, groups, topics, threads
Definitions within the topology of the
community will become blurred by linking. Growth of the group
will also change definitions. Groups and topics have been confusing
entities on Care2. I attempted to create a group that transformed
as the effects of events percolated through society, but when trying
to advance the Katrina group, which had morphed several times before,
I ran into extreme resistance. The group was not a group of people
(people groups evolve all the time); it was the group associated with
the Katrina topic, as if that group and those members were static, while
the same members move on in other incarnations in other topic discussions.
My initial idea was to move the Katrina
group with all its members into the new model community, and then morph
the Katrina group to where it could deal with the types of issues faced
in the aftermath of the disaster. There are disasters everywhere,
all the time; and some communities experience continual (and unnecessary)
disaster. Very big groups are effectively meta-topics of group
interests; they are not social groups. To be a social group, a
group has to be focused on its members: their birthdays, personal worries,
recipes, and the like. While that kind of activity is an exciting
and unexpected development of the web, it is not, and should not be,
activist-oriented, even if a social group includes activist members.
They are probably better thought of as gatherings, where politics can
be left outside, along with religion, because of the natural diversity
in activist groups. Sex and love would probably work well, though,
if drama is something you like. Among activists, the religion
of love and kindness probably cannot be kept outside; kindness and well-intentioned
actions are requirements of the full-time activist. Haters and
disruptors do not last long as activists, unless they change.
Reaching the most needy
The most needy people in the world do not
have Internet access; in poor nations, poverty is corruption, and
elites exist to further crush the poor. The goal of the activist is
to help these people, and activism in the Information Society is the only
way. The act of de-isolating the needy very often operates under
the radar. The United Nations, which opposes the natural flow
of electronic information in vi
Outside the New Model, Leaving
Activism is about personally making connections,
especially with law-makers and other activists; this means working as
a respected individual, benefiting from the scaffolded learning in knowledge
organization, and equipped with linking-built knowledge a dimension greater
than what is available from contemporary sources. The group
functions based on individual effort, and builds the individual with
new meaning.
Like a missionary, the new community
member leaves the group, not unlike a linking statement. The new
member becomes situated in a place where she can create inquiry, facilitating
inquiry so significant that there is no reason to return to framed cognition
handed down from so-called superiors. The ultimate effort of the
new community is to validate individuals in the context of their world,
to teach communication, and to get the target law-makers to do the right thing.