Technology on the Verge
Development of knowledge
Knowledge
At the root of all knowledge is inquiry. Building understanding
requires an inquiring interest in the world, and confidence in the underlying
understanding that has already been developed. Ideas that have been
developed need to be proven, and the best way to know that they are proven is
through experience. Models test principles in action, providing benefits
while being tested -- assuming the underlying principles are sound.
Understanding
Stability and success in society depend on democracy: the ability of people
to interact with their governments, and to form them in such a way that life
is beneficial. Without an understanding of all the interrelating aspects and
forces at play in society, no individual is capable of making knowledgeable
and effective decisions. Because so many gaps in understanding have allowed
the democratic process to be damaged, national leaders have been able to
misguide nations, and even the world, toward disaster again and again. The
development of inquiry skills and the construction of knowledge in every
person is becoming increasingly crucial to the survival of humanity.
Group learning
Knowledge-building skills are best developed in youth, and the most
important period of learning is the middle school stage. During this
period, maturing children develop their relationship with the world around
them, and it is also during this period that they develop the skills to
further understand the world. Science is as important to understanding as
anything: it develops the skills both to understand what is known and to
launch new inquiry. When developing students' interest in science, it is
better to allow them to form their own inquiry groups, and to enable their
scientific inquiry with the types of instruments often made available for
public use in museums.
Information Society
Unquestionably, the most effective tool available to youth today for building
knowledge is the Internet. The Internet is the present incarnation of the
Information Society that has defined human society for thousands of years by
enabling the development of all of our technologies. The same faculties
that enable youth to learn languages easily also make them adept at
developing computer skills, and even original computer communication
technology.
Capitalization
Unfortunately, an opposite current is undermining the knowledge-building
benefits of the Information Society and the development of well-informed
populations. Increasing cycles of capitalization have caused successive
generations to be poorer than preceding ones. And the increased competition
for resources caused by increasing capitalization has led to the fracturing
of families, with the loss of traditional role models as children are
effectively tossed to the wind.
Globalization
Beyond the erosion of the population base by the cycles of capitalization, a
continuation of the process of colonization, now called globalization, seeks
to continually move both populations and resources in ways that benefit only
the elite and controlling cultures of every country. Further adding to the
stresses of alienation caused by instability are the stresses of conflict
that result when one poverty-stricken culture is forced by the globalist
capital structure to take resources from another -- conflict that
consistently absolves the ultimate beneficiaries: the elite.
Historical development of knowledge
Concepts of the development of knowledge, and of the implementation of
knowledge during the learning process, have long been developed in the West,
starting with Aristotle. In the East, the Buddha implemented knowledge and
experience in ways that focused learning more emotionally and from a
perspective of cultural responsibility. Technological development has
always been a democratic process in which innovation has been driven either
by the needs of humanity, as in the invention of the plow, or by the pure
inspiration of independent thinkers such as Newton, Watt, or Darwin -- or by
socially motivated scientists such as Ruth Benedict, who introduced us to
Synergy and to concept mapping, the technique she used to develop her
Synergy theories.
Thought linking
Concepts are conceived as thought, developed as text, and shared
contextually. In other words, thoughts become valid when they are
contextualized. Beyond this, ideas need to be exposed, or situated, within
the environments they are meant to influence in order to have a beneficial
effect on the world; this is sometimes thought of as
super-contextualization. The root of the word describing thoughts that have
been written down, "text," is the same as the root of the word "textile."
Written expression is woven into the fabric of society to make society
better. Words, more than any other expression, have an influencing effect
on society; words combined with illustrations or music are that much more
effective at communicating and influencing. These are memes: units of human
thought and expression that can be contained and transported.
Linking ideas, or concepts, together to make them more powerful is the
most valid purpose of today's Information Society and the Internet. All
thought begins its journey into the interactive spaces of communication as
small text; it is therefore the responsibility of the Information Society,
with its information technology, to nurture these thoughts and to help them
situate into areas of need, to dispel ignorance and inefficiency and, most
important, to significantly alter humanity's current technological and
cultural directions, which are increasingly suicidal and nearly devoid of
mutual caring between humans.
Linking technologies have been developed, but they are desperately
lame. They operate at the server level and, to this day, seem to be
attempting to revive the technology bubble that was inflated just prior to
2000. Today the bubble is called Web 2.0, or social networking; as
expected, the new bubble, or 2.0, is deflating quickly.
Thoughts begin with the author, so it is reasonable that the author
should direct his thoughts. The text editor is not far from pencils or
pens, but text editors can utilize the most important strength of the Web:
the modification of text with hyperlinking. Ideas can be given attributes
that will allow them to navigate their way through the vast knowledge of the
Information Society to conjoin with other similar, or friendly,
information. In this way, thoughts are contextualized and then
super-contextualized, or situated. They are situated in ways that authors
themselves can likewise link, and collaborate. Today's Information Society,
the Internet, is strongest in facilitating collaboration.
The text editor on the author's desktop is where thoughts and concepts
are initiated, not the central web or application server. The text editor
should therefore be where the author's text is enhanced with hypertext
information, to give it the power of Internet communication. Editors are
already sophisticated, generally being referred to as integrated development
environments, or IDEs, and the very best IDEs are highly extensible and
released into the public domain. The majority of the technology needed to
link thoughts and to join authors in collaboration already exists; it is
just misdirected.
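As an illustration only, here is a minimal sketch in Perl of the attribute
idea: a thought is given tags, and two thoughts conjoin when they share a
tag. The names and structures here are hypothetical, not an existing
linking technology.

    #!/usr/bin/perl
    # A toy model of thought linking: thoughts carry attribute tags,
    # and two thoughts link when any tag is shared between them.
    use strict;
    use warnings;

    my @thoughts = (
        { text => 'Inquiry builds knowledge.',  tags => ['inquiry', 'learning'] },
        { text => 'Group learning aids youth.', tags => ['learning', 'youth'] },
    );

    my %first_tags = map { $_ => 1 } @{ $thoughts[0]{tags} };
    my @shared     = grep { $first_tags{$_} } @{ $thoughts[1]{tags} };
    print "linked via: @shared\n" if @shared;   # prints "linked via: learning"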
Systems
Operating Systems
The Unix system introduced a human communications interface: the shell
languages. This shell interface to the core operations of the computer
was perfected by David Korn during the initial development of the open
systems technological suite. He provided us with a friendly control
structure for the interoperable Unix systems, accompanied by a sensible and
effective tool kit of small supporting programs. In Unix, every aspect of
the system is represented as a file -- which is an object -- in a
hierarchical and accessible structure.
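A brief Perl sketch of the everything-is-a-file idea: on Linux systems even
the kernel's own state is exposed through ordinary files under /proc (the
specific file used here is a Linux convention, not universal to all
Unixes).

    #!/usr/bin/perl
    # Kernel state is read like any other file in the hierarchy.
    use strict;
    use warnings;

    open my $fh, '<', '/proc/uptime' or die "cannot open /proc/uptime: $!";
    my ($seconds) = split ' ', <$fh>;   # first field: seconds since boot
    close $fh;
    print "System uptime: $seconds seconds\n";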
Objects
Object-Oriented Technology creates class structures for code, primarily in
a modular architecture. Early modules, as found in the C language, are
callable shared libraries; encapsulated code can be invoked by reference
without necessitating the insertion of the written code itself. In an
object-oriented language, this concept of shared libraries is extended to
the point where the entire program structure is a single class module.
Modules
In the object-oriented paradigm, code is written into modules that can be
reused or replaced at will. Modules can be called from within
modules. The class structures define how modules of code are used, and
also the structures in which data is kept to be used or developed.
Modular code can be used in operating systems as well, so that capabilities
can be added to the systems, such as a system's ability to use add-on
hardware. Ideally, modules should be able to be added or removed while
systems are running.
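A minimal sketch of the module idea in Perl, using a hypothetical package
name; the package is a reusable class module, and the main program merely
calls into it by reference.

    #!/usr/bin/perl
    use strict;
    use warnings;

    package Tool::Counter;               # a reusable class module
    sub new  { my ($class) = @_; return bless { count => 0 }, $class }
    sub bump { my ($self)  = @_; return ++$self->{count} }

    package main;                        # the caller reuses the module
    my $counter = Tool::Counter->new;
    print $counter->bump, "\n";          # prints 1
    print $counter->bump, "\n";          # prints 2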
The Microkernel
Independent of the unstable technological and financial environments now
causing distress for the Information Society, two operating systems, the
ExOS and L4, are being developed that take advantage of modular
architectures. The modularity they use increases flexibility by allowing
code modules to be introduced, and the closeness of their execution to the
CPU increases processing efficiency many times over.
Development Techniques
In parallel with object-oriented developments, social workspace models are
being developed that empower humans within highly modular logic and data
environments. These development models, such as aspect engineering, show
an evolution of the computer communications environment that parallels
developing concepts about natural human organization and developmental
education, such as Constructivism.
The Public Domain
While business has dominated the development of the computer, it was the
public domain that introduced the spreadsheet application, a significant
contribution to business. Programmers who work in mutualist environments,
collaborating through the Internet, write code and then release it into the
public domain for the benefit of all, to increase productivity for the whole
world. This programming model significantly alters the business model of
centralized control -- the monopolistic concept that knowledge is owned,
that it is a form of property. Technologists working mutually through the
Internet, or e-mutually, are not motivated by business greed; they benefit
the world by participating in the market economy in a generous, or
Synergistic, way. The public domain environment so accelerated the
development of the Internet, the greatest contribution of the Information
Society yet, that the major computer monopoly, Microsoft, was caught by
surprise and forced to alter its business strategy one hundred and eighty
degrees in only a few months. Despite the strengths of the public domain
computing community over the monopolistic culture, only a few free systems
have taken a dominating lead, and these are barely known to the
population. The Linux operating system, combined with the Apache web
server, serves the vast majority of web pages. The Linux community has
developed the Linux operating system into a distributed supercomputing
architecture that has effectively ended the need for expensive centralized
systems: the mainframe. The leading search engine, Google, and the
national weather agency, NOAA, depend on the Linux operating system to run
their supercomputing architectures.
How "the system" fails technology
Application centricity
In an attempt to mirror the thought architecture of the human in its
computer products, Apple Computer, with its Macintosh, introduced an
intuitive virtual desktop where windows on the screen contain text, art, and
business applications. Drag-and-drop capabilities, where both data and
application instances are treated as objects, allow computer users to
associate data with applications, and to bring data to "places" such as a
garbage area, or an application, by dragging icons representing the data
with a mouse. Still, data is associated by its type with specific
applications, and often with only one application, increasing the tendency
to monopolize the computer environment, which reinforces the tendency of
corporate executives to apply the principles of land annexation to the
development of knowledge.
Linear data storage
The most significant computer business contribution, the spreadsheet
application, allowed for the entry of numerical data in the simple column
and row structures of business and accounting tables, and added to the
business environment the significant capability of applying mathematical
formulas to the numbers held in the now electronic tables. By changing
individual values in a numerical structure where the values are interrelated
by a formula, users can see how the entire grouping of numbers will change
when the value of one number is changed. This ability to apply formulas to
tabular data brings the "what if" capability of numerical prediction to the
average business person. By changing one value, or condition, a business
person can see the changes brought to all values.
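A minimal "what if" sketch in Perl, with entirely hypothetical figures:
profit is derived from price and units by a formula, and changing one input
shows the change in the dependent value, as in a spreadsheet.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %cell = ( price => 10, units => 500, cost => 3000 );
    my $profit = sub { $cell{price} * $cell{units} - $cell{cost} };

    printf "profit: %d\n", $profit->();   # prints 2000
    $cell{price} = 12;                    # the "what if": raise the price
    printf "profit: %d\n", $profit->();   # prints 3000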
Beyond this, the relational database brought all the capabilities of the
spreadsheet to the mass storage environment. Entire sets of tables
containing vast amounts of data can be updated and sequentially processed,
where each row of a table represents some item that is of interest to
business, such as an asset or a human.
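A sketch of the relational idea, using Perl's DBI with an in-memory SQLite
database; this assumes the DBD::SQLite module is installed, and the table
and values are hypothetical.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                           { RaiseError => 1 });
    $dbh->do('CREATE TABLE assets (name TEXT, value INTEGER)');
    $dbh->do('INSERT INTO assets VALUES (?, ?)', undef, 'truck', 30000);

    # An entire table can be updated in one operation.
    $dbh->do('UPDATE assets SET value = value * 2');
    my ($value) = $dbh->selectrow_array(
        'SELECT value FROM assets WHERE name = ?', undef, 'truck');
    print "truck: $value\n";   # prints 60000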
Technological and Financial Collapse
In March of 2000, a series of collapses in the NASDAQ technology and
innovation market caused a world recession that bordered on depression.
Nearly simultaneously, events such as terror attacks against key development
regions, the revelation of widespread corruption in corporate structures,
and the transfer of technology services to highly repressive nations
effectively froze conceptual innovation. There were refinements to
technology, such as the miniaturization of digital cameras, but no
significant new technology, and groundbreaking ideas that had begun to be
developed, such as the L4 operating system, have since been ignored.
Thinman
Data Centricity
When a dataset is activated on a desktop by clicking its representative
icon, it is opened in an application; the user focuses attention on some
data, yet the user is brought to an application. There is no fundamental
difference between opening an application to load a dataset into it and
clicking on a dataset icon to work with it. The opposite of this
application-centric model, then, would be a data-centric model. In a
data-centric model, the user's clicking on data would signal the computer to
give the user tools to work with the data, rather than simply making the
user an operator of an application, and hence a subordinate of the
application's developers.
When the user signals the system by clicking on data, the system provides
for the user modules that represent the tool sets the user needs to work
with the data. If the dataset happened to be a poem set within graphics,
and the user wanted to modify it, a simple text editor as well as a graphics
palette would be provided by the system for the user to use within the
window displaying the data. Since a window is necessary for the user to
interact with the data, it makes sense that a familiar and well-developed
windowing technology be provided. The web browser is probably the best
known of windows today, and the Mozilla, or Netscape, browser, with its
Gecko engine driving the rendering process, would seem to be an excellent
place to implement the data-centric approach. Mozilla has plug-in
capabilities which actually provide a simplified version of this model --
but really only an illusion, where plug-ins convert the browser into a
different kind of window application so that it can handle different kinds
of data.
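A minimal sketch in Perl of the data-centric dispatch idea; the type names
and tool module names are hypothetical, invented purely to illustrate the
model of offering tools suited to the data rather than surrendering the data
to a single owning application.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %tools_for = (                   # data type => tool module offered
        'text/poem'     => 'Tool::TextEditor',
        'image/graphic' => 'Tool::Palette',
    );

    # Clicking data asks the system for every tool its types call for.
    sub tools_for_data {
        my (@types) = @_;
        return grep { defined } @tools_for{@types};
    }

    # A poem set within graphics gets both an editor and a palette:
    print join(', ', tools_for_data('text/poem', 'image/graphic')), "\n";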
Openness of the Modular Design
The proof of concept provided by Perl's CPAN, or Comprehensive Perl Archive
Network, distribution model is an excellent illustration of how a system
like this would be supported. When a system's user decides that
capabilities from Perl's module archive need to be installed, the source
code is compiled as it is loaded onto the computer. If supporting code is
needed to run the requested modules, then the CPAN system recursively loads
supporting modules until the entire base of code support is
constructed. The compiling is required with each loading of a module from
the archive as an attempt to assure support for a wide variety of systems,
and also as a security measure. Since Perl is effectively only a language,
its modules are provided solely for the purposes of writing Perl programs,
usually for web services or operating system support.
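For example, the classic one-line invocation of the CPAN client (the module
name here is hypothetical) fetches, builds, and installs a module, first
doing the same recursively for any missing prerequisites:

    perl -MCPAN -e 'install "Some::Module"'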
Perl Language and Modular Architecture
Those who are familiar with the development of open systems during the
1990s know that Perl provided technologists with a language specifically
modeled to support systems operations and the manipulation of character
datasets, and that it was extended to provide logic engines for web
servers. By quickly developing technology to adapt Perl to the Web, to run
in conjunction with web servers, most notably the Apache server, Perl
programmers accelerated the Web's growth far beyond any developmental period
experienced by the Information Society. In the early days of the Web,
nearly all dynamically created pages were created by Perl servers.
Today, nearly all pages are provided by languages descended from Perl, such
as PHP and Ruby. It can also be argued that the Perl community provided
Java developers with the proof of concept model, if not its actual
architectural model; Java now dominates business logic on the Web. Far
more significant, however, is the proof of concept model for support
provided by the CPAN modular distribution system.
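A minimal sketch of an early dynamic page of the kind described above: a
Perl CGI script, typically run under the Apache server, that emits a
response header and then generates HTML on the fly.

    #!/usr/bin/perl
    use strict;
    use warnings;

    print "Content-Type: text/html\r\n\r\n";   # the CGI response header
    my $now = localtime;                       # the dynamic content
    print "<html><body><p>Generated at $now</p></body></html>\n";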
Closeness to the CPU
Perl provides yet another interesting component in its attempt to evolve:
the Parrot virtual machine, or Parrot VM. To achieve efficiency, to
increase its support for systems, and even to provide virtual machine
support for other languages, the Parrot VM utilizes a register-based
language which is similar to CPU-level assembler language. This
assembler-like language is itself interpreted, and the interpreter is
written in C; but since the interpreter does not utilize stacks, the
compiler necessary to create Parrot machine code likewise need only use
registers, and not stacks, making the machine code many times more efficient
for the system in terms of both CPU and memory use. All this efficiency is
given to the support of the virtual machine, where virtual machines, despite
their popularity, are generally considered to be a less efficient
architecture. The combination of a virtual machine like Parrot with an
operating system that is close to the CPU, such as ExOS or L4, compiled with
a compiler tuned for register use, would create a combined architecture many
times more efficient than the architectures presently available.
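To make the register-based style concrete, here is a toy register machine
sketched in Perl; the instruction set is entirely hypothetical, but it shows
the key property: operands name registers directly, so nothing is pushed to
or popped from a stack.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my @reg = (0) x 4;                     # four integer registers
    my @program = (
        [ 'set',   0, 40    ],             # R0 = 40
        [ 'set',   1, 2     ],             # R1 = 2
        [ 'add',   2, 0, 1  ],             # R2 = R0 + R1
        [ 'print', 2        ],             # print R2
    );

    for my $ins (@program) {
        my ($op, @arg) = @$ins;
        if    ($op eq 'set')   { $reg[$arg[0]] = $arg[1] }
        elsif ($op eq 'add')   { $reg[$arg[0]] = $reg[$arg[1]] + $reg[$arg[2]] }
        elsif ($op eq 'print') { print $reg[$arg[0]], "\n" }   # prints 42
    }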
Sharing through Meshing
The architecture described above, here called the "Thinman model," moves the
support usually provided by information systems much closer to the
individual users actually working with information. Because the
architectural trend today is clearly towards smaller, quicker, and more
mobile systems, it makes sense for these systems to support one another.
As wireless communication increasingly extends wire-based communication
networks, this model of mutual support of individual small systems by other
local systems becomes increasingly practical and desirable. Along with
sharing code capabilities and the information contained within data, these
small computers can support each other's communication abilities.
Each small wireless computer within an area can act to transfer various
kinds of data on behalf of other computers, creating a flexible, or fluid,
network called a mesh within every location where wireless computers are
operating. While data transportation speeds decline proportionally with
every link provided by a computer to extend a network, today's communication
speed is so high that the mutual support of systems through code module
sharing would hardly be affected. Also, a mutually supporting, fluid, and
open network would typically connect quickly to high speed ground-based
networks, minimizing delays caused by the linking between systems.
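A rough Perl sketch of the proportional decline mentioned above, with
illustrative numbers only (the 54 Mbps figure is a nominal wireless link
rate, used here as an assumption): each added relay hop divides the usable
throughput, yet even several hops leave ample speed for module sharing.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $link_mbps = 54;                       # assumed nominal link rate
    for my $hops (1 .. 4) {
        my $effective = $link_mbps / $hops;   # simple proportional model
        printf "%d hop(s): ~%.1f Mbps\n", $hops, $effective;
    }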