The browser console today is more sophisticated than a full-fledged Java IDE. And probably 10 times more complicated to implement, too.

now you can search by typing 2 different words, e.g. “heart card”

generative art = piss splash updated with programing.

here's the history of Processing, the generative art programing language for non-programers. Non-programers? lol, actually, it's java.

〔 Research Essay: The History of Processing By Maks Surguy. At https://maxoffsky.com/research/research-essay-the-history-of-processing/ 〕

looks like hackernews graced me again. https://news.ycombinator.com/item?id=16741116

thanks to https://mastodon.technology/@danielhglus for the shoutout

Am beginning to think, elegance of code should be ban'd. Because, often, it makes code slower, harder to understand, and far more time consuming to create.

It's like artwork. Often, it serves no everyday value. A prime example is modern geometric furniture.

However, in a society where the concept of code elegance doesn't exist, code quality will suffer in general. It'd be like a society sans the notion of beauty.

what should be done, is that code elegance should become a branch of math. Like groups represented by generators and relations. So that code elegance becomes precisely defined, a part of representation theory. Just as complexity theory is the study of algorithms.

follow me on my new mastodon account at https://mstdn.io/@xahlee and on reddit https://www.reddit.com/user/xah

for my patreons https://www.patreon.com/posts/17567297

Unicode Circled Numbers ① ② ③ (major update)

Standard Fonts on Linuxes (minor update)

am going to start to post to patreon again. see https://www.patreon.com/posts/17332715

for a paid audience. It's better that way; it increases my quality, gives it a goal.

the problem with just blogging or posting to twitter is that nobody really cares. With a patreon post, there's a goal, similar to a commercial org. You write; if it's good, people pay. When it's no good, people stop.

And comments tend to be more valuable. They have a basis.

20 years of open source has eliminated the power of individuals. Instead, you get these “we support opensource” mega corps like Google and Facebook. The programers, by day, work for these greedy corps; by night, they sing open source, push to git, and wipe out the bread of small programers.

〔 Why There Will Never Be Another RedHat: The Economics Of Open Source By Peter Levine. At https://techcrunch.com/2014/02/13/please-dont-tell-me-you-want-to-be-the-next-red-hat/ 〕

〔 RethinkDB: why we failed By Slava Akhmechet. At http://www.defmacro.org/2017/01/18/why-rethinkdb-failed.html 〕

now you can play chinese chess! Play Chinese Chess Online

see https://twitter.com/xah_lee/status/965774858746347520 and the unicode 11 draft https://www.unicode.org/versions/Unicode11.0.0/

See also: Unicode Characters ☯ ⚡ ∑ ♥ 😄

one very annoying thing when coding a 2d grapher is the reversed y-axis in the programer idiot's coordinate system.

In svg, either you do `(ymin+ymax-y)`

for every point, or,

g.setAttribute("transform", `translate(0 ${ymin + ymax}) scale(1 -1)`)

but then your text gets flipped. what a fing pain.
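As a sketch, the per-point approach is just a one-line helper (the name `flipY` is my own, for illustration):

```javascript
// flip a y coordinate so y increases upward, given the range [ymin, ymax].
// svg and canvas put y = 0 at the top, so we mirror the point about the range:
// y maps to ymin + ymax - y.
const flipY = (y, ymin, ymax) => ymin + ymax - y;

// example: in the range [0, 10], the math point y = 2 (near the bottom)
// lands at 8 in screen coordinates (near the bottom of the screen too).
console.log(flipY(2, 0, 10)); // → 8
```

The group transform `translate(0 ${ymin+ymax}) scale(1 -1)` computes the same mapping, since scaling by -1 then translating gives -y + (ymin+ymax).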

why does the programer's canvas reverse the y-axis? not looking into history, but i surmise, cuz it was just easy in the early days, like unix f and opengl f. It's ok to recognize it as a necessity, but the programer idiots are such that they see it as good. In every lang design, you run into such idiotic problems.

from unix to c to java to opengl (or whatever gl fk). and these idiots say, well but it is fast! it's efficient! the problem with these fheads is that they don't really understand math or the meaning of efficiency; they just know micro-tuning and memory addresses.

I think the hardware-based mindset is slightly going away today, because the new generation dunno it anymore. And langs like haskell and golang are teaching people to understand the separation of efficiency and abstraction.

1 example of pseudo-efficiency trumping language design is Bitmask Used as Boolean Parameters. See http://xahlee.info/UnixResource_dir/writ/bitmask.html

The typical hacker gets confused by this, and defends it vociferously.

this happens for languages up to about 2005, but not after.
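For contrast, here's a quick sketch of the two styles (the function and flag names are made up for illustration):

```javascript
// bitmask style: boolean options packed into one integer parameter.
// the call site is opaque: what does 3 mean?
const BOLD = 1, ITALIC = 2, UNDERLINE = 4;

function styleTextBitmask(text, flags) {
  let out = text;
  if (flags & BOLD) out = "<b>" + out + "</b>";
  if (flags & ITALIC) out = "<i>" + out + "</i>";
  return out;
}

// named-parameter style: readable at the call site, no bit-fiddling.
function styleText(text, { bold = false, italic = false } = {}) {
  let out = text;
  if (bold) out = "<b>" + out + "</b>";
  if (italic) out = "<i>" + out + "</i>";
  return out;
}

console.log(styleTextBitmask("hi", BOLD | ITALIC)); // → <i><b>hi</b></i>
console.log(styleText("hi", { bold: true, italic: true })); // → <i><b>hi</b></i>
```

Same behavior, but the bitmask version leaks a machine-level encoding into the language interface.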


time to buy a high resolution monitor. 2560×1440. You see more, sans fidgeting with scroll and swipe.

find out your screen resolution size JS: Find Window Size

On Meta Syntax, Formal Language, and Logic (minor update)

i'd have to say, clojure lisp has failed, due to its complex intermix with java/jvm, and the clojure community as a cult refusing to listen. Meanwhile, golang and @kotlin are thriving. Kotlin actually replaces clojure, with 10x more users.

it's funny, that the clojure head Rich Hickey emphasizes simplicity, and makes a fuss about how simple may appear hard. But that's a tinted glass look at simplicity. Clojure, the way it forces intermixing with java, is more complex than ANY programing language.


here's the syntactic complexity of golang.

slice type is declared like this:

`var name []type`

but map type is declared like this:

`var name map[type]type`

you see, there's an inconsistency.

That is, there's no single form for both. Basically, every semantic thing in the language has its own ad hoc syntax. The syntax is not composable.


See also:

- JS Syntax Soup 「p in o」
- Python 「… in …」 And 「… not in …」
- Context Dependent Syntax, Lisp setf, Python a[i]=x
- Java Array Syntax Soup
- Grammar Complexity
- Syntax, Formal Language, Pattern Matching
- Formal Definition of Systematic Grammar
- Composable Syntax
- Syntax Algebra

been posting to my reddit r/Xah. see https://www.reddit.com/r/Xah/

just trying it out. If you are a reddit user and find my posts on reddit convenient, vote them up. If i don't see many users, i'll stop doing it.

also, since i have blogs on emacs, JavaScript, programing, and other topics, do let me know what you want to see, so i can shape what kinda things i post there.

.@brave browser's allow/deny notification bar is annoying. It pops up for fullscreen, notifications, and other things. If you ignore it, the bar doesn't go away but adds space to the top of the browser. And when there are multiple, you might click the wrong allow/deny button.

another annoying thing about @brave browser is that when you hover the mouse over a tab, it shows that tab's content temporarily (in a grayish fog). Very annoying and confusing, especially if you have hover/dwell features on in the OS.

climbing the ladder.

arduous climb.

xah's edu corner extempore! episode №20180115122340 on math courses, and how to become a mathematician for programers

for young coders out there, the proper order of learning math is: highschool algebra, trig, pre-calc, calculus, linear algebra, then optionally intro diff equations. These are basics. Must be in that order. These are the basics needed for engineering. But they are not math major basics yet.

for a math major (to become a mathematician), you need abstract algebra and real analysis. Order doesn't matter. After you've had these, you are acquainted with the math “language”, aka math maturity, i.e. the way mathematicians talk. These are typically 3rd year math major courses.

after that, you may have the following in arbitrary order: complex analysis, differential geometry, topology, set theory. After studying these, you can consider yourself a mathematician. You know what math is about, or know where to go. All of the above are the traditional main math courses.

for programers, you might wonder, where is graph theory, type theory, game theory, logic, combinatorics, statistics? These are not typical main courses of a MATHEMATICIAN. These are called discrete math, sometimes classed as comp sci. (while statistics n probability r applied math)

the discrete math subjects do not have the elaborate prerequisite sequence that analysis/algebra has. Anyone can start to learn graph theory, game theory, combinatorics, number theory, etc. But you don't get deep without the years of analysis/algebra, the real mathematician stuff.

See also: Free Math Textbooks

Logical Operators, Truth Table, Unicode (updated)

Programing: “or” Considered Harmful (updated)

〔 CoffeeMiner: Hacking WiFi to inject cryptocurrency miner to HTML requests By Arnau Code. At http://arnaucode.com/blog/coffeeminer-hacking-wifi-cryptocurrency-miner.html 〕

seems like a good article.

See also: http://xahlee.info/linux/linux_iptables_basics.html

pop stars quietly drop out every decade or 2. And this is also true for programer stars. A star everybody knows and talks about now, but you don't know or recall who it was in the previous decade.

when something is beyond us, we can't distinguish the levels. e.g. some are millionaires, but some r billionaires. To us, we don't comprehend; we group them as just “rich”. To them, it's 1 thousand times diff, literally. This goes for pop stars, politicians, businessmen.

Similarly, in programing, to outsiders, we are all smart nerds beyond their comprehension. But to us, there r script kiddies, and “web designers”, up to those who wrote the google search engine. This applies to any field or community u r not familiar with. which basically means, everything.

just read a few articles/papers about the intel bugs Meltdown and Spectre. Meltdown requires an OS kernel patch, causing a 5% to 30% slowdown for system-call-intensive apps (networking/file). And Spectre is unfixable.

We are screwed. By next week, nobody will remember the problems, except hackers and gov. But have a cookie; you are screwed already anyway, with ur phone and usb and ssh etc.

if you are not a computer nerd, here's the gist: hackers discovered one very big hole to your password, and there's nothing you can do.

instead of Santa Claus, unicode calls it Father Christmas.

🎄 🎅 🤶 ⿅ 𐂂 🦌

see Unicode Search

so why does Unicode call Santa Claus “Father Christmas”? Wikipedia has the history https://en.wikipedia.org/wiki/Father_Christmas

but be warned it probably contains biased writing.

this Wikipedia version from 2005 gives an easier to follow picture of the history of Father Christmas https://en.wikipedia.org/w/index.php?title=Father_Christmas&oldid=33065629

quote:

Father Christmas is a name used in the United Kingdom, Australia, New Zealand and several other Commonwealth Countries, as well as Ireland, for the gift bringing figure of Christmas or yuletide. Although Father Christmas, Saint Nicholas and Santa Claus (the latter deriving from the Dutch for Saint Nicholas: Sinterklaas), are now used interchangeably, the origins of Father Christmas are quite different.

Dating back to Norse mythology, Father Christmas has his roots in Paganism. Before Christianity came to British shores it was customary for an elder man from the community to dress in furs and visit each dwelling[citation needed]. At each house, in the guise of "Old Winter" he would be plied with food and drink before moving on to the next. It was thought he carried the spirit of the winter with him, and that the winter would be kind to anyone hospitable to Old Winter. The custom was still kept in Medieval England, and after a decline during the Commonwealth, became widespread again during the Restoration period. A book dating from the time of the Commonwealth, The Vindication of CHRISTMAS or, His Twelve Yeares' Observations upon the Times involved Father Christmas advocating a merry, alcoholic Christmas and casting aspersions on the charitable motives of the ruling Puritans.

He was neither a gift bringer nor was he associated with children. During the Victorian era when Santa Claus arrived from America he was merged with “Old Winter”, “Old Christmas” or “Old Father Christmas” to create Father Christmas, the British Santa which survives today.

suppose you write a chess program. And by brute force, you completely solve chess. That is, you've determined the optimal move for every position.

That is the idea, and the beginning, of automated theorem proving.

of course, we cannot brute force all the way, since there are more positions than we can fathom. Therefore, we try to cut corners and be smarter in our ways of enumeration, such as the neural networks of AlphaZero.

aside from that, with mathematics, we cannot even begin to brute force or neural net, since math is not codified the way chess or go is. The problem of turning a human math question into logic and into the computer is itself not a solved problem. Before we can automatically prove theorems, we need a codification of math, and that's in the realm of the foundations of math.

and in this realm, even though we've made a lot of progress (or none, relative to the cosmos), there are still mysteries and unbelievers and glory holes. We make do with what we can. Thus, we have “conjecture” searchers, “assisted” provers, alternative foundations such as homotopy type theory, and such. Their meaning and context evolve. Few know what they are talking about, realistically speaking.

Dear Lu, here's a problem you might find illuminating.

suppose you went to RadioShack and built a tiny neural network Artificial Intelligence software. In just 1 hour of self-training, it plays tic-tac-toe so well that it never loses.

Now, that's some accomplishment. But now, how do you solve, say, x + 1 = 2, for arbitrary constants in place of the 1 and 2, with your neural net?

Can your neural net solve such a math problem?

source of discussion https://plus.google.com/u/0/+johncbaez999/posts/Xk36jKsosGT

ASCII Table (minor update)

just removed disqus comments on all my sites for now. They are now forcing image ads. And their ads are those low quality sensational types. To opt out of ads would be $10/month. But comments take 30min/day to reply to, and 95% are garbage. (i have 5 thousand pages on my sites) Might add it back; we'll see. let me know what you think.

Unicode Flags 🏁 (major rewrite)

unicode emoji should be ban'd. Extremely annoying: you try to show a symbol, and it becomes an emoji.

if you have ◀ ▶ ⏯, the last becomes a emoji.

Adding U+FE0E does not always work.
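For example, here's a minimal sketch of appending the variation selector in JavaScript; note that renderers are free to ignore the request:

```javascript
// U+23EF PLAY/PAUSE, often rendered as emoji by default.
const playPause = "\u23EF";

// appending U+FE0E (VARIATION SELECTOR-15) requests the text style;
// U+FE0F would request the emoji style instead.
const textStyle = playPause + "\uFE0E";

// the string now holds two code points: base char + variation selector.
console.log(textStyle.length); // → 2
```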

And macOS has a bug forcing emoji to be tiny, the same size as a letter. It ignores the CSS font size spec.

and which symbol will become an emoji is unpredictable. On twitter, ◀ ▶ both become emoji.

ok, the whole thing is pretty fkd.

see 〔 Apple did not invent emoji By Eevee. At https://eev.ee/blog/2016/04/12/apple-did-not-invent-emoji/ 〕

and see replies at https://twitter.com/xah_lee/status/926994405046722560

the problem of computerizing math began with: THERE EXISTS ∃, and FOR ALL ∀. #haskell #coq #typetheory

Leon Chwistek (Kraków, Austria-Hungary, 13 June 1884 – 20 August 1944, Barvikha near Moscow, Russia) was a Polish avant-garde painter, theoretician of modern art, literary critic, logician, philosopher and mathematician.

Starting in 1929 Chwistek was a Professor of Logic at the University of Lwów in a position for which Alfred Tarski had also applied. His interests in the 1930s were in a general system of philosophy of science, which was published in a book translated in English 1948 as The Limits of Science.[1]

In the 1920s-30s, many European philosophers attempted to reform traditional philosophy by means of mathematical logic. Leon Chwistek did not believe that such reform could succeed. He thought that reality could not be described in one homogeneous system, based on the principles of formal logic, because there was not one reality but many.

Chwistek demolishes the axiomatic method by demonstrating that the extant axiomatic systems are inconsistent.[2]

2017-11-03 Wikipedia Leon Chwistek

Plants Emoji 🌵 🎄 🌷 (added a macOS screenshot)

post deleted

Quiz. write a function r(f,x,n) that returns a list [f(x), f(f(x)), ...] of length n. write it in your fav lang.

f is a function (e.g. f(x) = x+1), x is a number, n is a number ≥ 1. we want [f(x), f(f(x)), ...]

#haskell #javascript #golang #clojure

Someone asked why this is useful. For example, the factorial function, or the fibonacci sequence. In math it happens often. Check out “logistic map”, “iterated function system”, or “dynamical systems”.
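A sketch of one possible answer, in JavaScript (the quiz says any lang will do):

```javascript
// r(f, x, n) returns [f(x), f(f(x)), …] of length n.
// assumes n ≥ 1, per the quiz statement.
function r(f, x, n) {
  const result = [];
  let v = x;
  for (let i = 0; i < n; i++) {
    v = f(v);       // apply f once more
    result.push(v); // collect each iterate
  }
  return result;
}

// example with f(x) = x + 1, starting at 0:
console.log(r(x => x + 1, 0, 3)); // → [1, 2, 3]
```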

comment at

https://noagendasocial.com/@xahlee/98929138430987793

https://plus.google.com/+XahLee/posts/f2phScSxUrc

https://twitter.com/xah_lee/status/925538572832202752

lol.

wait, why is haskell on the left side?

Unicode search at Unicode Characters ☯ ⚡ ∑ ♥ 😄

remember, boys n girls, there's no lang that has rigorous math-like doc or spec. None. http://xahlee.info/comp/blog.html #haskell #ocaml

2, in programing, if you spend 1 min with a good doc, you'd spend 1 hour without it, or even 10. When there's no doc, 10 days.

3, but then why doesn't the programing community appreciate or have good docs? because

4, ① the nature of code, changes all the time. Docs usually don't keep up.

5, ② it's hard to convert how-to into what-is, the latter is math style doc/spec.

6, ③ docs in software are literally useless, in some sense. They add nothing to the software's behavior.

7, ④ programers, partly due to above, don't know how to write well.

been reading math 2 hours a day in past months. what a joy. In contrast to reading programing docs n lang specs. Programers are such idiots.

programers don't appreciate good docs. n they have this nasty concept of “grok” (from unix fkheads), n in a flash they'll tell you to dig the source code.

there's no lang in practical use that has rigorous math-like doc or spec. #Haskell? #OCaml? lol, they've got the worst “grok it” docs and specs.

yet the haskell fkheads are like, “algebraic” data structure and monoid and stuff. Each one sounds like a superior mathematician. Monad ya ass.

homotopy, a continuous deformation between 2 functions. How can such a topology, differential geometry notion be tied to logic, set theory, the foundations of math? that's the story of homotopy type theory. Absolutely fascinating.

#math if you haven't studied group theory before, do so now. Wikipedia article is very good.

after Wikipedia #math group, read http://xahlee.info/Wallpaper_dir/c0_WallPaper.html

when programers use math jargons, they dunno which side is ass, which is mouth. #haskell #lisp

if a programer mention idempotent monad directed graph, n they can't talk basic abstract algebra, tell them 2 shut piehole #haskell #lisp

programers talking garbage math jargon happens, from 1990s perl and sql to 2000 lisper homoiconicity to today js haskell category idiots.

programers and mathematicians are very distinct communities. The 2 basically don't communicate, not unlike engineers and lawyers.

mathematicians, in general, look down on programing. They dunno what a subroutine, function, object, or class is.

programers usually look up to and idolize math, yet have 0 clue. you wouldn't have a clue about math unless you've had 3 years worth of undergraduate MATH MAJOR.

now n then we see hacker idiots discuss how important math is to programing. that's, like, guys in a bar on the tao of the quantum cosmos.

- 1, due to my public website since 1995, i've talked to lots of people, coders, geeks, and many weird people. (the same ilk attracts)
- 2, Usually, they know me, but i don't know/remember people. (plus, they are often anonymous)
- 3, It has happened quite a few times: in an argument about coding or other things, something ticked me off, and my screed turned supporters/fans to stone.
- 4, am a schizoid. That basically means a loner, or a person with very little emotion. Any attachment or relationship troubles us greatly.

you see those Google Doodle? Never, ever, click it or read about it. If you do, your brain is tainted. This is similar to never watch TV.

Google Doodle was fun in 2000s. It's casual, non-intentional. Today, it's commercialization plus propaganda.

there's an idiotic program called pngquant.

it reduces png file size by lossy compression.

if you want lossy, go to jpg or webp

in September, i'll be blogging on my patreon account only.

https://www.patreon.com/xahlee

If you like my stuff, i hope you patreon me there.

to my patreon supporters, new article https://www.patreon.com/posts/13809835

golang's choice of tab for indentation is the correct one. However, emacs golang mode forcing it to be DISPLAYED as 8 spaces is most idiotic. It undoes the correct thinking.

See also: Programing: Tab vs Space in Source Code

golang is truly a simple, superb, practical language. Plus real functional programing features. And fast! Puts clojure and haskell to shame.

despite my supreme love for functional programing, i'd say, clojure is a complex idiocy, on so many levels. And Haskell too.

my golang tutorial is coming into shape.

See also: Xah Clojure Tutorial

my site ranking, i think that's the highest.

find some sites you know, and let me know what you get. On twitter, Google Plus.