One could state that Artificial Intelligence (AI) methods enable the finding of, and interaction with, patterns in the information available from the contexts of an event, object or fact. That information can be shaped into data points and data sets. Many of these sets are tremendously large: so large, so interconnected and so changeable that no human could see the patterns that are actually there, the patterns that are meaningful, or the patterns that could be projected to anticipate the actuality of an imagined upcoming event.
While not promising that the technologies coming out of the field of AI are the only answer, nor the answer to everything, one could know of their existence and perhaps apply some of the methods used in creating them. One could, furthermore, use aspects from within the field of AI to learn about a number of topics, even about the processes of learning itself, and about how to find unbiased or biased patterns in the information presented to us. Studying some basics of this field could offer yet another angle of meaning-giving in the world around and within us. What is a pattern, if not an artificial promise to offer some form of meaning?
It’s not too far-fetched to state that the study of Artificial Intelligence is partly the study of cognitive systems[1] as well as of the contexts within which these (could) operate. While considering AI,[2] one might want to briefly consider “context.”
Here, “context” is the set of conditions and circumstances preceding, surrounding or following a cognitive system, as these relate to its processed, experienced, imagined or anticipated events. One might want to weigh how crucial conditions and circumstances are, or could be, to both machine and human.[3] The field of AI is one of the fields of study that could perhaps offer an opportunity to do just that.
A context is a source from which a cognitive system collects its (hopefully relevant) information, or at least its data. Cognitive Computing (CC) systems are said to be those systems that try to simulate human thought processes, via computerized models, in order to solve problems.[4] It is understandable that some classify this as a subset of Computer Science, while others classify CC as a (sometimes business-oriented) subset of the field of AI.[5] Still others might link it more closely to the academic work done in Cognitive Science. Whether the systems are biological or artificial, to a number of researchers the brain-like potentials are their core concern.[6]
As can be seen in a few of the definitions, and as argued by some experts, the broad field of AI technologies does not necessarily have to mimic *human* thought processes or human intelligence alone. As such, AI methods might solve a problem in a different way than a human would. However similar or different, the meaning-giving information drawn from a context is important both to an AI solution and to a biological brain. One might even wonder whether it is their main reason for being: finding and offering meaning.
The contextual information an AI system collects could be defined by, or categorized as, time, locations, user profiles, rules, regulations, tasks, aims, sensory input, various other large to extremely large data sets, and the relationships between these data sets in terms of how they influence or conflict with one another. All of these data sources simultaneously create increasing complexity, due to real-time changes (i.e. ambiguity, uncertainty, and shifts). AI technologies offer insights by outputting the *best* solution, rather than the one and only certain solution, for a situation, in a context, at a moment in spacetime.
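To make this last point a bit more tangible, here is a minimal sketch, in plain Python and with entirely hypothetical context features, weights and candidate suggestions, of the idea that a system scores its options against a context and returns the *best*-scoring option with a confidence value, rather than one certain answer:

```python
# Minimal sketch (toy data, hypothetical scoring rules): rank candidate
# suggestions against a context and return the best one with a confidence,
# not a guaranteed answer.
from math import exp

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    exps = {name: exp(score) for name, score in scores.items()}
    total = sum(exps.values())
    return {name: value / total for name, value in exps.items()}

def recommend(context):
    """Score a few candidate suggestions against a toy context dictionary."""
    scores = {"umbrella": 0.0, "sunscreen": 0.0, "stay home": 0.0}
    if context.get("forecast") == "rain":
        scores["umbrella"] += 2.0
        scores["stay home"] += 0.5
    elif context.get("forecast") == "sun":
        scores["sunscreen"] += 2.0
    if context.get("hour", 12) >= 22:  # late evening nudges toward staying in
        scores["stay home"] += 1.0
    probabilities = softmax(scores)
    best = max(probabilities, key=probabilities.get)
    return best, probabilities[best]

if __name__ == "__main__":
    evening_rain = {"forecast": "rain", "hour": 23, "location": "home"}
    suggestion, confidence = recommend(evening_rain)
    # Prints the most plausible suggestion and its (non-certain) confidence.
    print(f"best suggestion: {suggestion} (confidence ~ {confidence:.2f})")
```

Change the forecast or the hour in the toy context and the “best” suggestion shifts with it; the confidence never reaches certainty.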
The wish to understand and control “intelligence” has attracted humans for a long time. It is therefore reasonable to think that it will attract our species’ creative and innovative minds for a long time to come. It is in our nature to wonder in general, and to wonder about intelligence and wisdom in particular; whatever their possible interlocked or independent definitions might be(come), and whichever their technological answers might be.
In considering this, one might want to be reminded that the scientific name of our species is itself a bit of a give-away of this (idealized) intention or aspiration: “Homo Sapiens.” Somewhat loosely translated, it could be understood to mean “Person of Wisdom.”
While some experts think that, at present, our intelligence is larger than our wisdom, others feel that, if handled with care, consideration and contextualization, AI research and development just might answer such a claim or promise positively, and might at least augment our human desire to become wiser.[7] Just perhaps, some claim,[8] it might take us above and beyond[9] being Homo Sapiens.[10]
For now,
we are humans exploring learning with and by machines in support of our daily
yet global needs.
For you and me, the steps toward such an aim need to be practical. The resources needed to take those steps must be graspable here and now.
At the foundation, to evaluate the validity or use of such claims, we need to understand a bit of what we are dealing with. Besides nurturing a number of other dimensions of our human development, we might want to nurture our Technological Literacy (or “Technology Literacy”).[11]
A number of educators[12] seem to agree that,[13] when considering human experiences and their environments, this area of literacy is not too bad a place to start.[14] In doing so, we could specifically unveil a few points of insight associated with Artificial Intelligence: that human-made, technological exploration of ambiguous intelligence.
[1] Sun, F., Liu, H., Hu, D. (eds.). (2019). Cognitive Systems and Signal Processing: 4th International Conference, ICCSIP 2018, Beijing, China, November 29 – December 1, 2018, Revised Selected Papers, Part 1 & Part 2. Singapore: Springer
[2] DeAngelis, S. F.
(April 2014). Will 2014 be the Year you Fall in Love with Cognitive Computing?
Online: WIRED. Retrieved November 22, 2019 from https://www.wired.com/insights/2014/04/will-2014-year-fall-love-cognitive-computing/
[3] Desouza, K. (October 13, 2016). How can cognitive computing improve public services? Online: Brookings Institution, TechTank. Retrieved November 22, 2019 from https://www.brookings.edu/blog/techtank/2016/10/13/how-can-cognitive-computing-improve-public-services/
[4] Gokani, J. (2017). Cognitive Computing: Augmenting Human
Intelligence. Online: Stanford University; Stanford Management Science and
Engineering; MS&E 238 Blog. Retrieved November 22, 2019 from https://www.datarobot.com/wiki/cognitive-computing/
[5] https://www.datarobot.com/wiki/cognitive-computing/
[6] One example is: Poo, Mu-ming. (November 2, 2016). China Brain Project: Basic Neuroscience, Brain Diseases, and Brain-Inspired Computing. Neuron 92, NeuroView, pp. 591-596. Online: Elsevier Inc. Retrieved on February 25, 2020 from https://www.cell.com/neuron/pdf/S0896-6273(16)30800-5.pdf. Another example is the work done at China’s Research Center for Brain-Inspired Intelligence (RCBII), by the teams led by Dr. XU, Bo and Dr. ZENG, Yi. Founded in April 2015, at the CAS’ Institute of Automation, the center contains four research teams: 1. The Cognitive Brain Modeling Group (aka Brain-Inspired Cognitive Computation); 2. The Brain-Inspired Information Processing Group; 3. The Neuro-robotics Group (aka Brain-Inspired Robotics and Interaction); and 4. Micro-Scale Brain Structure Reconstruction. Find some references here: bii.ia.ac.cn
[7] Harari, Y. N. (2015). Sapiens: A Brief History of Humankind. New York: HarperCollins Publishers
[8] Gillings, M. R., et al. (2016). Information in the Biosphere: Biological and Digital Worlds. Online: University of California, Davis (UCD). Retrieved on March 25, 2020 from https://escholarship.org/uc/item/38f4b791
[9] (01 June 2008). Tech Luminaries Address Singularity.
Online: Institute of Electrical and Electronics Engineers (IEEE Spectrum).
Retrieved on March 25, 2020 from https://spectrum.ieee.org/static/singularity
[10] Maynard Smith, J., et al. (1995). The Major Transitions in Evolution. Oxford, England: Oxford University Press; and Calcott, B., et al. (2011). The Major Transitions in Evolution Revisited. The Vienna Series in Theoretical Biology. Boston, MA: The MIT Press.
[11] National Academy of Engineering and National Research Council. (2002). Technically Speaking: Why All Americans Need to Know More About Technology. Washington, DC: The National Academies Press. Online: NAP. Retrieved on March 25, 2020 from https://www.nap.edu/read/10250/chapter/3
[12] Search, for instance,
the search string “Technological Literacy” through this online platform: The
Education Resources Information Center (ERIC), USA https://eric.ed.gov/?q=Technological+Literacy
[13] Dugger, W. E., Jr., et al. (2003). Advancing Excellence in Technology Literacy. In Phi Delta Kappan, v85 n4, pp. 316-320, Dec 2003. Retrieved on March 25, 2020 from https://eric.ed.gov/?q=Technology+LIteracy&ff1=subTechnological+Literacy&ff2=autDugger%2c+William+E.%2c+Jr.&pg=2
[14] Cydis, S. (2015). Authentic Instruction and Technology Literacy. In Journal of Learning Design 2015 Vol. 8 No.1 pp. 68 – 78. Online: Institute of Education Science (IES) & The Education Resources Information Center (ERIC), USA. Retrieved on March 25, 2020 from https://files.eric.ed.gov/fulltext/EJ1060125.pdf
IMAGE CREDITS:
An example artificial neural network with a hidden layer.
en:User:Cburnett / CC BY-SA (http://creativecommons.org/licenses/by-sa/3.0/) Retrieved on March 12, 2020 from https://upload.wikimedia.org/wikipedia/commons/e/e4/Artificial_neural_network.svg