Math problem solving and mind mapping


Here are some ideas on a new technique for solving math problems using mind maps.

What is mind mapping?

Mind mapping is a special form of note-taking. Here are some essential features:

  • You take a (preferably large) sheet of paper in landscape format.
  • You write the topic / the problem in the middle of the sheet and draw a frame around it.
  • You write the main aspects and main ideas around that central topic and link them through lines to the center.
  • You expand the ideas in these “main branches” into subbranches etc.
  • Wherever appropriate, you should use figures, colours, arrows to link branches etc.
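The branching structure described above is just a tree with the problem at the root. As a purely illustrative sketch (the topic and branch names below are invented examples, not from the post), a mind map can be modeled as a dictionary mapping each node to its subbranches and printed as an indented outline:

```python
def print_map(node, children, depth=0):
    """Print a mind map as an indented outline, central topic first."""
    print("  " * depth + node)
    for child in children.get(node, []):
        print_map(child, children, depth + 1)

# Central topic, main branches, then subbranches, as in the steps above.
branches = {
    "Solve: x^2 - 5x + 6 = 0": ["Knowns", "Strategies", "Check"],
    "Knowns": ["quadratic", "integer coefficients"],
    "Strategies": ["factoring", "quadratic formula"],
    "Check": ["substitute roots back"],
}

print_map("Solve: x^2 - 5x + 6 = 0", branches)
```

Of course, the whole point of the paper version is the freedom to add figures, colours, and cross-links, which a plain tree doesn't capture.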

Here are three examples from Wikipedia. Beware: I didn’t bother to check their content. I’m just interested in their different layout and appearance.

A hand-drawn map:

A simpler hand-drawn map:

A computer-drawn map:

How can mind maps be used for solving problems?

Let’s start with…



July 16, 2018 at 3:21 am

Memory driven, he said

Hewlett-Packard’s lineage in research and development is noteworthy. The garage where Hewlett and Packard (the people) worked to build their first electronic circuits for test and measurement still stands as something of a Silicon Valley grotto honoring futurism.

The company’s devotion to R&D came under some criticism in recent years as it has pared down divisions, lines of business, and ultimately split itself into divergent entities – HPE and HPQ.

The criticism may not be so fully warranted, according to long-time technology observer Roger Kay, although he admits some merit in a general perception that the company reduced R&D spending as a percentage of overall revenue during the regimes of Carly Fiorina and Mark Hurd. He writes: “The fact is, a lot of Ph.D.s on staff were pursuing wacky science projects with not much oversight.”

Before we look at the wacky science factor, or lack thereof, in HPE R&D efforts under Meg Whitman, let’s look at its doppelganger, IBM.

IBM, too, has been in a paring phase for some time, and, like HPE, has given up on its PC business. (HPE is the renamed phoenix released from the split in which “H-P” became a printer company with PCs, known as HPQ, and a data center computer company, known as HPE.)

IBM continues to pursue a lot of very basic science as well as microprocessor technology to keep its Power chip fresh, although some of its recent computer advances have found it relying on an alliance with GPU maker Nvidia. HP’s PA-RISC chip, once roughly competitive with the Power chip, is now a mostly forgotten historical footnote. The follow-on to PA-RISC, which saw H-P forge an alliance with Intel to design the Itanium chip, is a historical footnote that people are still trying to forget.

Even as it has empowered an army of consultants to go forth and make businesses blue, IBM has continually sought out street cred for its high tech prowess. A slew of advanced technologies came together to form its entry into the futuristic field it liked to call “cognitive computing.” The entry that carried that forward was called “Watson.”

Named after the father and son who pretty much created the modern IBM (not at the same time, and not in a garage), Watson first appeared as a supercomputer able to win on the TV quiz show Jeopardy! In 2011, it garnered no small fanfare.

Big Blue has had a somewhat rougher road since, trying to position Watson 2.0 as the epitome of AI. Still, there isn’t much doubt that HPE was looking to evoke some of the same old-style new-technology good feeling when “the second most important supplier-run research lab in the business after IBM Research” rolled out The Machine in 2014.



July 1, 2018 at 11:50 pm

Synthetic nerve circuits

June 30, 2018 at 12:33 pm

Patterson’s History of Computing

June 24, 2018 at 3:41 am

AI resurgent: Boosting chip innovation

It has been interesting to see a resurgence of interest in new chip architectures. Just when you think it has all been done and there’s nothing new. Bam! For sure.

The driver these days is AI, but more particularly the machine learning aspect of AI. GPUs jumped out of the gamer console and into the Google and Facebook data centers. But there were more hardware tricks to come. The effort is to get around the tableau I have repeatedly cited here: the data scientist sitting there twiddling thumbs while the neural machine slowly does its learning.

I know when I saw that Google had created a custom ASIC for TensorFlow processing, I was taken aback. If new chips are what is needed to succeed in this racket, it will be a rich man’s game.

Turns out a slew of startups are on the case. This article by Cade Metz suggests that at least 45 startups are working on chips for AI-type applications such as speech recognition and self-driving cars. It seems the Nvidia GPU that has gotten us to where we are may not be enough going forward. Coprocessors for coprocessors, chips that shuttle data about in I/O roles for GPUs, may be the next frontier.

Metz names a number of AI chip startups: Cerebras, Graphcore, Wave Computing, Mythic, Nervana (now part of Intel). – Smiling Jack Vaughan

January 21, 2018 at 6:26 pm

Quantum fruition

It was a phenomenon discovered 100 years ago. Now, as technology based on quantum mechanics emerges from labs, Accenture and others are setting their sights on quantum machine learning, writes Jack Vaughan, as many years of quanta shuffling bear tutti-frutti fruit.

Click to read: Quantum machine learning may come knocking on analytics’ door – SearchDataManagement

December 23, 2017 at 1:59 am

Quantum ad impedimenta computing


A recent Wall Street Journal article doesn’t hold back on the hype, at least in its headline: quantum computing, it promises, will change the world as we know it, courtesy of Google. The story that follows is a bit more measured. The obstacles to successful quantum computing are discussed, and the murkiness of the applications is considered. There is a discussion of the activity of some players: D-Wave, IBM, and especially Google. Also noted: the NSA is building a quantum computer too. The conjecture (à la Google’s Hartmut Neven) is put forward that the nearest big opportunity for quantum computing relates to machine learning, presumably because probability is involved and the computation problems could eventually grow unmanageable. We will see.

The most obvious expectation is that the NSA is anticipating the possibility of a point where quantum computers could break important codes, and thus disrupt the present state of Internet commerce. Is that as big a threat as Hitler? I ask because there are parts of the quest for quantum computing that bring to mind the atomic bomb program of the 20th century.

When you look at the obstacles to successful quantum computing at a reasonable scale, you look at a problem of physics, and more. The obstacles include error correction, the coherence time (lifetime) of the qubit, low-temperature wiring, verification of quantum algorithms, and more.

It brings to mind the massive obstacles that faced the teams working in World War II to create the atomic bomb.

In search of something that worked, three methods were used for uranium enrichment: electromagnetic, gaseous, and thermal. Parallel efforts worked to produce plutonium from uranium in graphite reactors, by irradiation and transmutation (God bless you), followed by chemical separation. This is collectively known as the Manhattan Project.

In the end they built two types of bombs: an implosion design for plutonium, since a gun-type detonator proved impractical for that material, and a simpler gun design using very rare uranium-235 (painstakingly separated from uranium-238) that was used in the final analysis. A couple of years later the emphasis was all about hydrogen bombs.

This is not to mention the German atomic bomb effort, which, as far as I know, looked more like a reactor than a bomb. Or the British effort, which predated the American effort. Efforts are underway today by lesser powers, assuredly using methods honed by the great powers over time. As for that last bit, I cadged it from the guy sitting next to me in class, so please address all correspondence to him.

It is hard to think — today anyway — that development of the quantum computer of the 21st century will benefit from the same type of impetus the Manhattan Project did (World War). But that could change. Time will tell.

In the meantime, looking at the picture above, I think we are looking at RS-232 connectors in the innards of Google’s machine – why not RJ-11?? – Juan Ignacio Vaughan


Thank you, Wikipedia

October 19, 2017 at 1:38 am
