The 2023 MATLAB Central Flipbook Mini Hack contest runs from November 6 until December 3. Over 200 entries have been submitted in the first two weeks.

This year's mini hack features short animations. The contest software runs the program you submit to make an animated GIF file with exactly 48 frames and an inter-frame delay of 1/24 second. So, your animation runs for two seconds, then repeats continuously. If you want periodic motion, you need to be back where you started by frame 48.
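The contest harness assembles the GIF for you, but the format is easy to mimic locally. Here is a minimal sketch of the 48-frame, 1/24-second format using `imwrite`; the filename and the moving pattern are arbitrary choices of mine, not part of the contest.

```matlab
% Write a 48-frame animated GIF with a 1/24 second delay per frame.
% The phase returns to its start at frame 48, so the loop is seamless.
nframes = 48;
[x,y] = meshgrid(linspace(-1,1,160));
for k = 1:nframes
    phase = 2*pi*k/nframes;
    z = sin(6*x + phase).*cos(6*y - phase);      % periodic in phase
    im = uint8(round(127.5*(z + 1)));            % scale to colormap indices
    if k == 1
        imwrite(im,gray(256),'anim.gif','gif', ...
            'LoopCount',Inf,'DelayTime',1/24)
    else
        imwrite(im,gray(256),'anim.gif','gif', ...
            'WriteMode','append','DelayTime',1/24)
    end
end
```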

In previous mini hacks, programs had to be Twitter length -- at most 255 characters long. Now, the limit is 2,000 characters. Comments and formatting blanks are not counted. Remixes and reuse of other submissions are encouraged.

Participants and other viewers vote on the submissions. There are prizes like Amazon gift cards and T-shirts. MathWorkers may participate, but not win prizes.

Take a look at the Gallery.

I find the results fascinating. There are so many different creative styles, artistic talents and programming techniques. Here are a few of my personal favorites.

**Jenny Bosten**

Jenny Bosten is a familiar name on MATLAB Central. She is a Senior Lecturer in Psychology at the University of Sussex, where she is a "visual neuroscientist specialising in colour vision." Her code for Time lapse of Lake view to the West shows she is also a wizard of coordinate systems and color maps.

**隆光 中村**

I don't know anything about this person. All I see is this name, 隆光 中村, and this ingenious code for Fireworks.

**Ned Gulley**

Ned is the long-time MathWorker who is the architect of MATLAB Central, and who, this time, is also a prolific participant. One of his more mathematical animations is Orbiting Roots.

**Eric Ludlam**

Eric is head of the MathWorks development team for Graphics and Charting. Contributions like this Blooming Rose demonstrate his artistic design talent.

My own contributions are not nearly as attractive as these.

The 2,000 character limit is a good idea. It forced me to look critically at some old code and rewrite it to be simpler and clearer.

This program for a Bouncing Bucky Ball uses the `hgtransform` object to good effect. I also think it has a nice solution to the problem everyone faces: how to retain state from one frame to the next.

Here is a link to a slightly more complicated version with one `togglebutton` that provides a random restart capability. Bouncing_Bucky.m

Chen Lin, David Wey and Vinay Ramesh are running the Mini Hack this year.

Get the MATLAB code

Published with MATLAB® R2023a

Try your hand at a few exercises involving Exploring Matrices.

I have simplified the `Qube` app by removing these four buttons.

- `solve`. The <== key now controls the unscrambling operation.
- `scramble`. The ==> key now does six random rotations.
- `order`. I never found a satisfactory reference for the group theory of Rubik's cube.
- `score`. I never found a use for the nuclear norm.

Code for `Qube` dated 9/24/2023 is included in the Apps mzip archive.

Here are a few exercises for Exploring Matrices. The answers are available at ExMatAnswers.

**1.** Compute the following product by rows, and by columns.

$$ \left( \begin{array}{rrr} 8 & 1 & 6 \\ 3 & 5 & 7 \\ 4 & 9 & 2 \end{array} \right) \left( \begin{array}{r} 1 \\ 1 \\ 1 \end{array} \right) $$

**2.** Solve for $z$ using inner products of rows, and using linear combinations of columns.

$$ \left( \begin{array}{rrr} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{array} \right) \left( \begin{array}{r} 1 \\ z \\ 1 \end{array} \right) \ = \ \left( \begin{array}{r} 0 \\ 0 \\ 0 \end{array} \right) $$

**3.** What do the `m`, `n` and `p` buttons on the `Multiply` app do? What are the other buttons and what do they do?

**4.** If *A* is n-by-n and *x* is n-by-1, how many multiplications are required to compute *A x* ?

**5.** If *A* is m-by-n and *B* is n-by-p, how many multiplications are required to compute *A B* ?

**1.** What is *R*(30º)?

$$ R(\theta) \ = \ \left( \begin{array}{rr} \cos{\theta} & \sin{\theta} \\ -\sin{\theta} & \cos{\theta} \end{array} \right) $$

**2.** Explain https://xkcd.com/184.

**3.** What is the value of $\theta$ ?

$$ R(\theta) \ = \ \left( \begin{array}{rr} 0.8 & 0.6 \\ -0.6 & 0.8 \end{array} \right) $$

**4.** What is the value of $\theta$ ?

**5.** Edit a copy of `Rotate.m` and replace the house with a hand. You can use my hand or your own hand; see exercise 3.4 in Numerical Computing with MATLAB.

**1.** Show how homogeneous coordinates and matrix-vector multiplication by `Tx`, `Ty` or `Tz` produce translation.

**2.** What is the range of the rotations used by the pitch, roll, and yaw buttons on the `Grafix` app?

**3.** What color is the beacon on top of the plane? How would you change the beacon's color?

**4.** What is the function of the resolution and offset sliders for the teapot?

**5.** How many times does the bucky ball bounce off the sides of the plot window?

**1.** What is the color of the central cubelet in the Color Cube?

**2.** What do the "<=", "<==", "=>" and "==>" buttons on `Qube` do?

**3.** What is "God's Number" for a 3-by-3-by-3 Rubik's Cube? What are `Q20` and `Q26`? See Cleve's Corner 2022/09/05.

**4.** Can you restore the following scrambled cubes with fewer moves than `<==`, the unscramble key? Use the quarter-turn metric and reset the cube with `start` or by clicking on `stack` and `Q0`. You might also want to set `speed` to 30 or 45.

`LRUDFB`

`LRL'R'`

`FLLLRB`

`Q26`

- Reset the random number generator by entering `rng(r)` for some small integer `r` in the command window, and then generate six random rotations with the `==>` key.


I have spent much of my career working to bring abstract linear algebra and practical matrix computation closer together. This project is my latest effort.

Over sixty years ago, as a sophomore contemplating a major in mathematics, I took a course entitled Survey of Modern Algebra. We used a now-classic textbook by Birkhoff and Mac Lane that featured abstract theorems about groups, rings, fields, vector spaces and linear algebra. I remember that the colorful terms *alias* and *alibi* had something to do with change of basis and change of position, but I have never seen those terms again.

The next year, I took Numerical Analysis. We did some of the homework on a Burroughs 205 Datatron and I wrote a machine language program to solve simultaneous linear equations. I was hooked.

But at the time I did not realize that the two courses were about the same magnificent object -- the *matrix*.

Exploring Matrices is a multi-media project that shows matrices in action. Short videos, blog posts, interactive MATLAB software and self-study exercises investigate applications of matrices. The material is intended for students in an undergraduate course in linear algebra or computational science. However, anyone using matrices should find topics that interest them.

The first release of Exploring Matrices has six modules. All of the modules feature animated MATLAB displays and four of the modules include interactive MATLAB "apps". The modules are:

- Matrix Multiplication
- Rotation and Scaling
- Computer Graphics
- Matrices and Cubes
- Simulink
- AI and Gorillas

An introduction and six videos, ranging in length from one to six minutes, are available on YouTube at

https://youtube.com/playlist?list=PLn8PRpmsu08oGNmtBfFOmgVC0TlXDaLDJ.

The first four of these videos feature animations produced by our four MATLAB apps -- `Multiply`, `Rotate`, `Grafix`, and `Qube`. The other two videos describe two applications, simulation of control systems and neural networks for facial recognition (of gorillas).

Some viewers may just be learning the mechanics of matrix multiplication. Other viewers will have encountered it years ago. The traditional algorithm for computing the product of two matrices involves inner products between the rows of the first matrix and the columns of the second. A less familiar algorithm, which involves linear combinations of the columns of the first matrix, is often more efficient and informative. The two approaches produce the same final result from intermediate terms in different orders.
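A few lines of MATLAB make the comparison concrete. This is my own sketch, not the `Multiply` app's code; both loops form the same product while accumulating the intermediate terms in different orders.

```matlab
A = magic(3);
B = pascal(3);

% Inner products: C1(i,j) is row i of A times column j of B.
C1 = zeros(3);
for i = 1:3
    for j = 1:3
        C1(i,j) = A(i,:)*B(:,j);
    end
end

% Linear combinations: column j of C2 mixes the columns of A,
% weighted by the entries in column j of B.
C2 = zeros(3);
for j = 1:3
    for k = 1:3
        C2(:,j) = C2(:,j) + A(:,k)*B(k,j);
    end
end

% Both loops agree with the built-in product A*B.
```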

Here is one frame from the animation of these two algorithms generated by our `Multiply` app. The highlighted element in the first matrix either moves across the rows or goes down the columns.

Our first matrices are 2-by-2. We see how the matrix

$$ R \ = \ \left( \begin{array}{rr} \cos{\theta} & \sin{\theta} \\ -\sin{\theta} & \cos{\theta} \end{array} \right) $$

rotates points by the angle $\theta$, measured in degrees.

We also see how the matrix

$$ S \ = \ \left( \begin{array}{rr} \sigma & 0 \\ 0 & \sigma \end{array} \right) $$

makes objects larger and smaller.

The two can be combined with matrix multiplication. For more operations in higher dimensions, matrix multiplication provides a unifying framework.
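For example (my own sketch, with arbitrary values for the angle and scale), the two matrices combine by multiplication:

```matlab
theta = 30;                  % degrees, as in the text
sigma = 1.5;
R = [cosd(theta) sind(theta); -sind(theta) cosd(theta)];
S = sigma*eye(2);            % uniform scaling
v = [1; 0];
w = S*(R*v);                 % rotate, then scale
% Because S is a multiple of the identity, S*R equals R*S,
% so for these two operations the order does not matter.
```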

Here is one frame from the animation of rotation and scaling generated by the `Rotate` app. The first panel displays a 2-by-2 rotation matrix, the second panel displays a 2-by-2 diagonal scaling matrix, and the third panel displays their product.

Operations with the 4-by-4 matrices that are at the heart of modern computer graphics employ a system known as "homogeneous coordinates". The leading 3-by-3 submatrix produces rotation and scaling in three dimensions. The fourth column produces translations.
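As a sketch of the idea (the offsets and the point are arbitrary choices of mine, and the `Tx`, `Ty`, `Tz` matrices in the `Grafix` app may bundle more than this), a pure translation looks like:

```matlab
tx = 2; ty = -1; tz = 5;     % translation amounts
T = eye(4);
T(1:3,4) = [tx; ty; tz];     % the fourth column carries the translation
p = [1; 1; 1; 1];            % the point (1,1,1) in homogeneous coordinates
q = T*p;                     % q(1:3) is the translated point (3,0,6)
```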

Here is one frame from an animation of rotation about the x-axis generated by the `Grafix` app. This is often called "pitch". Rotations about the y- and z-axes are "roll" and "yaw".

Rubik's Cube, named for its inventor, Ernő Rubik, a Hungarian professor of architecture, is the greatest mathematical puzzle of all time. Our digital simulation of the puzzle, `Qube`, is powered by rotation matrices.

The model consists of 27 identical copies of a single small *cubelet* whose sides are colored red, white, blue, yellow, orange and green. Initially, all cubelets have the same orientation. A *move* is the simultaneous rotation of the nine cubelets in one of the six faces, by 90° or 180°, clockwise or counterclockwise. This leads to $4.3 \times 10^{19}$ possible configurations for a scrambled cube.
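The post does not spell out that count, but it follows from the standard argument: the 8 corner cubelets can be permuted and twisted, the 12 edge cubelets permuted and flipped, and a factor of 12 is removed because corner twist, edge flip, and permutation parity are each conserved. A sketch of the arithmetic:

```matlab
% 8 corner positions with 3 twists each; 12 edge positions with 2 flips
% each; divide by 3*2*2 = 12 for the conservation laws.
n = factorial(8)*3^8 * factorial(12)*2^12 / 12;
% n is 43,252,003,274,489,856,000, about 4.3e19, and happens to be
% exactly representable in double precision.
```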

The object of the puzzle is to return a scrambled cube to the initial state. Most people are interested in solving the puzzle rapidly, but I am more interested in the number of moves required.

`Qube` offers animations of many mathematical properties of Rubik's cubes. Here is a frame from one of them.

MATLAB's companion product, Simulink, is a block diagram programming environment used to design and simulate systems with multidomain models and to automatically generate the code required to operate embedded processors.

Simulink involves matrices in dozens of different ways, but most users rarely see operations at that level of detail. Our Simulink module shows a model of an automobile being driven on a test track and displays the pitch, roll and yaw recorded by the matrix connecting the coordinate system of the automobile to the coordinate system of the track.

This is a personal story about a project in the early stages of development.

My wife and I first visited gorillas in the Volcano National Park of Rwanda twelve years ago. An American primatologist named Dian Fossey had studied the gorillas between 1966 and her murder by poachers in 1985. Her book *Gorillas in the Mist* was very popular and was the basis for a critically acclaimed 1988 Hollywood movie starring Sigourney Weaver.

We have become good friends with the people in the Gorilla Doctors organization. These African and American veterinarians attend to the health of the roughly 1,000 gorillas in the park. Most of the gorillas have African names like "Inkundwa" and "Maisha". We envision a gorilla facial recognition system that is available on cell phones and tablets so that new guides and doctors can learn the names of their patients.

Inception-v3 is a convolutional neural network (CNN) that is widely used for image processing. We have a version of the network pretrained on more than a million images from the ImageNet database. This publicly available system knows nothing about gorillas. We must do additional training using photos of our unique subjects.

This is where matrices are applied. Training a CNN involves determining the values of thousands of weights and coefficients. The digital photos, regarded as vectors, are repeatedly multiplied by circulant matrices where each row is a shifted copy of the other rows. Importantly, a modern CNN also contains some nonlinear layers.

Here is one photo from a small test collection. Inkundwa appears to have his own selfie stick.

A self-extracting MATLAB source archive of our four apps is available at

https://blogs.mathworks.com/cleve/files/Apps_mzip.m

Thanks to Jackson Kustell, Josh Bethoney and Heather Gorr from MathWorks and Jan Ramer and Mike Cranfield from Gorilla Doctors.

We dedicate the Gorillas project to the memory of Mike Cranfield, DVM. Mike was Executive Director of the Mountain Gorillas Veterinary Project in Rwanda from 1999 until 2014. Before Rwanda, he held various positions at the Maryland Zoo in Baltimore.

Three months ago, Mike sent us a disc drive containing over 14,000 photographs of gorillas he had taken in Rwanda. We are now sorting and organizing the photos to provide specialized training of the facial recognition neural net.

A month ago, Mike was hospitalized from an apparent attack of West Nile Virus. He passed away on August 27. Ironically, after years of working safely in the mountain jungles of Central Africa, it is likely that he acquired the virus from a mosquito bite at his family's cabin in Canada.


MATLAB has dozens of test matrices. Here are a few.

- Random. `A = sprand(n,n,0.25)`. Random sparse, density = 0.25.
- Bucky. `A = bucky`. Sparse connectivity graph of the geodesic dome, the soccer ball, and the carbon-60 molecule.
- Wilkinson. `A = wilkinson(n)`. Wn+. Nearly equal double eigenvalues.
- Band. `A = triu(tril(A,2),-2)`. Elements near the diagonal.
- Triangular. `A = triu(A)`. Elements on and above the diagonal.
- Hessenberg. `A = triu(A,-1)`. Upper triangular plus one subdiagonal. See `schur`.
- Permutation. `A = sparse(randperm(n),1:n,1)`. One +1 in each row and column.
- Companion. `c = charpoly(A); A = [-c(2:end); eye(n-1,n)]`. Traditional companion matrix.
- Fiedler. `c = charpoly(A); A = fiedler(-c(2:end))`. Fiedler companion matrix.
- Hankel. `A = flip(gallery('toeppd',n))`. Constant antidiagonals.
- Toeplitz. `A = gallery('toeppd',n)`. Constant diagonals.
- Magic. `A = magic(n)`. Magic square.

- `gallery`. Nick Higham and MathWorks, https://www.mathworks.com/help/matlab/ref/gallery.html
- Anymatrix. Nick Higham and Mantas Mikaitis, https://nhigham.com/2021/11/09/anymatrix
- SuiteSparse. Tim Davis, Yifan Hu and Scott Kolodzie, http://sparse.tamu.edu
- MatrixMarket. NIST, https://math.nist.gov/MatrixMarket


(I have a guest blogger today. Ron Jones worked with me in 1985 for his Ph. D. from the University of New Mexico. He retired recently after nearly 40 years at Sandia National Labs in Albuquerque and now has a chance to return to the problem he studied in his thesis. -- CBM)

by Rondall Jones, rejones7@msn.com

Our interest is in automatically solving difficult linear systems,

A*x = b

Such systems often arise, for example, in "inverse problems" in which the analyst is trying to reverse the effects of natural smoothing processes such as heat dissipation, optical blurring, or indirect sensing. These problems exhibit "ill-conditioning", which means that the solution results are overly sensitive to insignificant changes to the observations, which are given in the right-hand-side vector, `b` .

Here is a graphic showing this behavior using a common test matrix, a 31-by-31 Hilbert matrix, with the blue line being the ideal solution that one would hope a solver could compute. The jagged red line shows the result of a traditional solver on this problem.

In fact, this graph is extremely mild: the oscillations often measure in the millions, not just a little larger than the true solution. Traditionally, analysts have approached this issue in the linear algebraic context by appending equations to `A*x = b` that request each solution value, `x(i)`, to be zero. Then, one weights these conditioning equations using a parameter usually called "lambda". We will call it `p` here. What we have to solve then is this expanded linear system:

[ A ; p*I] * x = [b; 0]

If we decompose `A` into its Singular Value Decomposition

A = U * S * V'

and multiply both sides by the transpose of the augmented matrix on the left, the resulting solution is

x = V * inv(S^2 + p^2*I) * S * U' * b

instead of the usual

x = V * inv(S) * U' * b

It is convenient in the following discussion to represent this as

x = V * PCV

where

PCV = inv(S) * U' * b

is what we call the *Picard Condition Vector*. Then `x` is computed in the usual SVD manner with the change that each singular value `S(i)` is replaced by

S(i) + p^2/S(i)

This process is called *Tikhonov regularization*.
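The formulas above fit in a few lines of MATLAB. This is my own sketch, not the ARLS code; the 31-by-31 Hilbert matrix echoes the example in the graphic, while the smooth solution, the noise level, and the value of `p` are arbitrary choices.

```matlab
n = 31;
A = hilb(n);                          % severely ill-conditioned test matrix
xtrue = sin(pi*(1:n)'/n);             % a smooth "ideal" solution
rng(0)
b = A*xtrue + 1e-10*randn(n,1);       % observations with a little noise
[U,S,V] = svd(A);
s = diag(S);
beta = U'*b;
PCV = beta./s;                        % Picard Condition Vector
xnaive = V*PCV;                       % unregularized solution oscillates wildly
p = 1e-9;                             % Tikhonov parameter, chosen by eyeball
x = V*((s./(s.^2 + p^2)).*beta);      % regularized solution stays near xtrue
```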

Using Tikhonov regularization successfully requires *somehow* picking an appropriate value for `p`. Cleve Moler has for many years jokingly used the term "eyeball norm" to describe how to pick `p`. "Try various values of `p` and pick the resulting solution (or its graph) that 'looks good'".

My early work in attempting to determine lambda automatically was based instead on determining when the PCV begins to seriously diverge. Beyond that point one can be fairly sure that noise in `b` is causing `U'*b` to decrease more slowly than `inv(S)` is increasing, so their product, which is the PCV, starts growing unacceptably. Such an algorithm can be made to work, and versions of my work have been available in various forms over the years. But determining where the PCV starts growing unacceptably (which I refer to as the *usable rank*) is based on heuristics, moving averages, and the like, all of which require choices of moving-average lengths and other parameters. This is not an optimal situation, so I began trying to design algorithms that do not use any heuristics.

How do we do that algorithmically? Per Christian Hansen gave a clue to this question when he said (my re-phrasing) that "the analyst should not expect a good solution for such a problem unless the PCV is 'declining'". We note that this adage is a context-specific application of a general requirement: when a function is represented by an orthogonal expansion, the coefficients of the orthogonal basis functions should eventually decline toward zero. If this behavior does not happen, then typically either the model is incomplete or the data is contaminated. In our case the orthogonal basis is simply the columns of `V`, and the PCV is the set of coefficients that would be expected to decline. A good working definition of "declining" has been hard to nail down. The algorithm in `ARLS` implements this concept using two essential steps:

- First, we replace the PCV by a second-degree polynomial (a parabola) least-squares fit to the PCV. This allows a lot of the typical "wild" variations in the PCV to be smoothed over and thereby tolerated without eliminating the problem's distinctive behavior.

- Second, we don't actually fit the curve to the PCV, but rather to the **logarithm** of the PCV. Without this change, small values of the PCV are seen by the curve-fitting process as just near-zero values, with no significant difference between a value of 0.0001 and a value of 0.0000000001. But in the (base 10) logarithm of the PCV these values nicely spread out from -4 to -10 (for example).
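Here is my reading of those two steps as code, not the ARLS source; the synthetic PCV is a cleanly declining stand-in for a real problem's Picard Condition Vector.

```matlab
k = (1:20)';
PCV = 10.^(-0.4*k);                   % synthetic, cleanly declining PCV
c = polyfit(k, log10(abs(PCV)), 2);   % parabola fit to log10 of the PCV
slope = 2*c(1)*k + c(2);              % derivative of the fitted parabola
declining = all(slope <= 0);          % the condition Phase 1 drives toward
```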

So, Phase 1 of ARLS searches a large range of values of p (remember, p is Tikhonov's "lambda") to find a value just barely large enough to make the slope of the parabolic fit entirely negative or zero. This gives us a tight lower bound for the "correct" value of p.

Phase 2 of ARLS is much simpler. Since p is the minimum usable regularization parameter, the solution tends to be less smooth and less close to the ideal solution than optimal. So we simply increase p slightly to let the shape of the graph of x smooth out. Our current implementation increases p until the residual (that is, `norm(A*x-b)`) of the solution increases by a factor of 2. This is, unfortunately, a heuristic. But an appropriate value can be determined by "tuning" the algorithm on a wide range of test problems.

We call this new algorithm Logarithmic Picard Condition Analysis. (If the problems you work on seem to need a bit more relaxation you can, of course, increase the number 2 a bit. It is 5 lines from the bottom of the file.) In the example shown in the graphic above, ARLS matches the ideal solution so closely that the blue lines for the ideal and computed solutions are indistinguishable.

In addition to ARLS(A,b) itself, we provide two constrained solvers built on ARLS which are called just like ARLS:

- ARLSNN(A,b), which constrains the solution to be non-negative (like the classic NNLS, but with regularization).

- ARLSRISE(A,b) which constrains the solution to be non-decreasing. To get a non-increasing, or “falling” solution, you can compute -ARLSRISE(A,-b).

Try ARLS. It's available from the MATLAB Central File Exchange, #130259, at this link.


I have just returned from a one-day workshop at U. C. Santa Barbara honoring John Gilbert on his 70th birthday and his official retirement after 20 years on the UCSB faculty.

I have known John since he was a teenager.

In the late 1960's, John's father, Ed Gilbert, together with fellow mathematicians Don Morrison and Sto Bell, left their jobs at Sandia National Labs in Albuquerque and established the Computer Science Department at the University of New Mexico. Ed was especially interested in undergraduate education and led the department to early adoption of Pascal and Unix in the curriculum.

In 1972, my wife at the time, Nancy Martin, and I were seeking a university where we could both have faculty positions. UNM offered me a job in the Math Department and Nancy one in Computer Science. I stayed at UNM for 13 years, eventually succeeding Morrison as Chairman of Computer Science.

When I first met the Gilbert family in '72, both John and his younger brother Erik were undergrad students at UNM. A year later, both brothers were admitted to grad school in Computer Science at Stanford. After getting their Ph.D.'s in CS at Stanford, Erik went on to cofound a software company that produced a dialect of Lisp and John joined the Computer Science Department at Cornell.

After several years at Cornell, John returned to California and the famous Xerox Palo Alto Research Center.

Sometime around Christmas in 1988, Iain Duff, the British authority on sparse matrices, wanted to go skiing in the Sierras. Iain arranged with Gene Golub to give a talk at Stanford. I was living in Menlo Park at the time and went to the talk. So did John Gilbert and Rob Schreiber, from Hewlett Packard Research in Palo Alto.

After the talk, everybody went for coffee at Tresidder, Stanford's student union. During the ensuing discussion, John, Rob and I decided it was time to have sparse matrices in MATLAB. The result was the first new data structure in MATLAB, along with the paper describing it. See the links at SIAM and MathWorks.

Our collaboration on sparse matrices has led to an enduring friendship. Every year, at SCxx, the High Performance Computing conference in November, the three of us and Jack Dongarra get together. Here we are with Sven Hammarling from NAG, at SC17 in Denver.

After a dozen years at PARC, John returned to academic life at the University of California, Santa Barbara. Last Saturday, Aydin Buluc and Daniel Lokshtanov, two of John's UCSB Ph. D. students, organized the JRG70 workshop. Here is a link to the Web page, including the list of talks presented. It was the first time since Tresidder that Iain, John, Rob and I had all been together.

Here is a portrait of the JRG70 participants. As usual, John is being modest; he's in the back row, in a burgundy sweater.


If you have been following my posts about Wordle, be sure to see the puzzle in today's *New York Times*, Tuesday, April 4.


I recently had an opportunity to chat with Ernie, the Large Language Model currently under development at the Chinese internet search giant, Baidu. As I expected, Ernie's responses are in Chinese. I don't speak Chinese, so I have asked Google Translate for the response in English.

One of my questions that Microsoft's ChatGPT answered incorrectly was

Who coined the term "embarrassingly parallel?"

Ernie's response to the same question was

Google Translate:

Who coined the embarrassing word parallel?

Well, that's a **very** unfortunate misunderstanding. And, it's just repeating the question. That's an old trick; the mother of all chat bots, Eliza, used it nearly sixty years ago.

One of the questions I often ask when I meet someone for the first time is

Do you use MATLAB?

Ernie's reply was

Google translation is

I think I love you.

Well, that's nice, but doesn't really answer my question.

Can a chat bot assist with writing this blog? I don't expect help with the MATLAB code, or with the graphics, or with any mathematics. How about the prose, if it isn't too technical?

Here is the opening sentence of the post I made a few weeks ago.

The 4-by-4 matrices in the panels on the following screenshots are at the heart of computer graphics.

I asked Ernie how that would look in Chinese. Ernie responded with

When Google translates that back to English, we get

The 4×4 matrix in the screenshot panel below is at the heart of computer graphics.

Ernie decided to make the sentence singular, which happens to shorten it. But I am afraid that isn't much help for this blog.

I have already described my chat with ChatGPT. This Chinese competitor is certainly not an improvement. For now, I will continue to produce this blog the old fashioned way, without any "help" from AI.


When I tackle a Wordle puzzle, I like to make all the key decisions myself. My three assistants set up puzzles and suggest words when I ask for help, but I guide the actual solution. My assistants also make it possible for me to play Wordle anywhere, anytime, even when my laptop is in airplane mode. I don't need the *New York Times* or access to the Web.

`Wordler`, `Words` and `Wordie` are the three assistants. `Wordler` replaces the *Times* by generating puzzles and evaluating responses. `Words` provides lists of possible responses. `Wordie` handles the Wordler Window and colors the letters gray, green or gold.

`Words` has a `vocabulary` of 4665 five-letter English words. Any of them are acceptable responses. The vocabulary begins with

vocab = [ ... "ABEAM" "ABETS" "ABHOR" "ABIDE" "ABLED" "ABLER" "ABODE" "ABORT" ...

And, 584 lines later, ends with

"ZILCH" "ZINCS" "ZINGS" "ZIPPY" "ZOMBI" "ZONAL" "ZONED" "ZONES" ... "ZOOMS" ];

If you were to print the entire vocabulary with 40 words per page, you would print over 100 pages of words.

It took me a long time to write the `Words` assistant, which is called whenever the Words button in the Wordle Window is clicked.

`Words` is supported by a seventeen-program library of functions named `Wordspq` where `p` and `q` are nonnegative integers with `p+q <= 5`. `Wordspq` finds words with `p` green letters and `q` gold letters. The programs in the `Words` library all have the same structure involving five nested `for` loops.

The last line of `Words` is

feval(['Words' p q],Gray,Green,GreenLoc,Gold,GoldLoc)

`Gray`, `Green` and `Gold` are lists of letters with specified colors and with locations `GreenLoc` and `GoldLoc`. Locating the green letters is easy because they must be in specific slots. Locating the gold letters is tricky because each of them can be in any of several different slots.
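I have not seen the `Words` sources, so here is a minimal sketch of that filtering, with a toy vocabulary and constraint values of my own choosing:

```matlab
vocab = ["TROUT" "GROUT" "MEOWS" "OTHER" "COURT"];
Gray = 'AI';  Green = 'T';  GreenLoc = 5;
Gold = 'RO';  GoldLoc = {[2 3],[3 4]};
keep = true(size(vocab));
for k = 1:numel(vocab)
    w = char(vocab(k));
    ok = ~any(ismember(w,Gray));                  % no gray letters anywhere
    for j = 1:numel(Green)
        ok = ok && w(GreenLoc(j)) == Green(j);    % greens in their exact slots
    end
    for j = 1:numel(Gold)
        ok = ok && any(w(GoldLoc{j}) == Gold(j)); % golds in an allowed slot
    end
    keep(k) = ok;
end
matches = vocab(keep);                            % keeps TROUT and GROUT
```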

For example, this situation in the NYT puzzle described below would result in a call to `Words13` with

Gray = 'AIHEC'
Green = 'T'
GreenLoc = 5
Gold = 'ROU'
GoldLoc = {[2,3],[3,4],[1,2,4]}

`Wordler` starts a game by choosing a secret random target from the vocabulary, or from a smaller subset about half the size. At the same time, I choose my starting word, which is usually `RATIO`. My assistants respond with the Wordler Window and a simple keyboard.

The gold `O` tells me the target contains an `O`, that it is not in position 5, and the target does not contain `R`, `A`, `T`, or `I`. I know there are hundreds of such words in the vocabulary. One of them is `DEMOS`, which I enter on the keyboard.

`DEMOS` happens to be a very lucky choice. The target has an `E` in the second slot, an `S` in the last slot, `M` and `O` in the remaining slots, and no `D`. When the answer does not occur to me in a minute or two, I click the `Words` button. The response is

MEOWS cnt = 1

So, there is only one word to choose, and it earns five greens.

Let's do the *Times* puzzle from March 23. I start with my mathematical `RATIO`. I see that the answer contains `R`, `T` and `O` and does not contain `A` or `I`.

I happen to remember that `OTHER` qualifies. It does not hit any new letters, but it places additional restrictions on the ones I already have and eliminates `E` and `H`.

`Words` now lists 37 words that I should choose from. I pick `COURT` because it contains `U`, the only remaining vowel.

`Words` informs me that there are only two possibilities left, `TROUT` and `GROUT`. I pick the one without a double consonant and it is the winner.

Here is an atypical, but instructive, example. For this puzzle I am pleased to see `Wordler` gives `RATIO` a green `A` in position 2 and a gold `R` somewhere in positions 3 through 5. I remember one of my favorite "technical" terms, `PARSE`.

To use a baseball metaphor, `PARSE` hits a triple and almost gets an in-the-park home run. Now I need to ask `Words` for qualifying responses. There are exactly two, `PARED` and `PARER`. (Both come from the verb "to pare", which means to cut the outer skin off something.)

One of the choices has a double consonant, so I choose the other one. When it doesn't fly, the only choice left earns the five-leaf clover.

How do I generate five golds? I need the starting guess to be a *permutation* of the final answer. A few moments' thought suggests `TAKES` and `SKATE`. I am sure there are other possibilities. But this one is special because `STEAK` makes it triplets. `TEAKS` would make four permutations but does not meet the "hard mode" restrictions.

Over a year ago, MATLAB programs for solving Wordle puzzles were described by Adam Filion as a guest blogger on Loren's blog and by Matt Tearle with a YouTube video.

Working on my Wordle obsession has been very interesting. I have developed some useful tools and I see forgotten five-letter words everywhere. You can share the fun by downloading the code at this link and running it yourself.


**Wordle Sneak Previews**

The Web sites that are claiming to have advance knowledge of the daily Wordle answer have become unreliable, so I am closing my sneak preview business.
