The Spectre of Math

February 6, 2012

Diffy Qs notes at 20000 downloads

Filed under: Mathematics,Teaching,Technology — jlebl @ 12:31 am

So the differential equations textbook just reached 20000 downloads from unique addresses. The real analysis textbook is close behind (despite being a year younger) at around 19200. The rate is growing: it started out at around 200 per week for both in the fall and is now pushing 400 a week. Since an overwhelming percentage of the hits come from Google, I think Google might have ranked the pages higher. So if you want to help out with the project of free textbooks: link to the books on your blog, page, whatever. And press those social buttons on the page; I guess that helps too.

It’s also interesting to see how IPv6 is doing. So far, 82 IPv6 addresses have looked at the real analysis book and 43 at the diffyqs book. As IPv6 has only been active on the server for about half a year, that’s still a very tiny percentage: during that time frame, about 6-7 thousand IPv4 addresses looked at the diffyqs book and about 8-9 thousand at the real analysis book. But at least someone is using IPv6 (if I could find an internet provider that offered IPv6, I’d use them, but I didn’t find one in Madison).

July 12, 2011

No more overheating

Filed under: Hacking,Linux,Technology — jlebl @ 7:15 pm

Pissed off about the CPU overheating, I wrote a simple daemon. It monitors the core temperatures and sets the cpufreq governor to powersave when the temperature in either core is above 90 degrees Celsius, and sets it back to ondemand when it gets below 75 in both cores (I pulled those numbers out of my ass; they might need tuning). It simply polls the temperature every 5 seconds. There is no configuration or anything; simply change the code and recompile. It’s all the neurons I’m willing to commit to this problem.

Yes, I know performance might suffer, since powersave keeps the CPU slow even when it could safely go faster, but I don’t care about performance; I care about the machine not turning off randomly. I guess ondemand is actually better power- (and heat-) wise when everything is normal, but when the heat is high, powersave does come to the rescue.

Here is the code for those who want to do something similar. You might need to modify it heavily. I called it noverheatd.c; I simply compile the thing with something like gcc -O2 -Wall noverheatd.c -o noverheatd, place the resulting binary in /root, and then add /root/noverheatd & to /etc/rc.local. The parts that need modification are set_policy, where you need to set the number of CPUs your kernel thinks it has, and the main loop, where you need to set the right coretemp paths for all the coretemps you have. I had to run “sensors-detect” as root from the lm_sensors package to obtain those.
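Something along these lines (a minimal sketch rather than the exact code; the coretemp paths, the CPU count, and the thresholds are the bits specific to my box and will almost certainly need changing for yours):

/* noverheatd.c -- a minimal sketch of the daemon described above. */
#include <stdio.h>
#include <unistd.h>

#define NUM_CPUS  2        /* how many CPUs the kernel thinks you have */
#define HIGH_TEMP 90000    /* hwmon reports millidegrees Celsius */
#define LOW_TEMP  75000

static const char *coretemps[] = {
	/* paths found by running sensors-detect; yours will differ */
	"/sys/devices/platform/coretemp.0/temp2_input",
	"/sys/devices/platform/coretemp.0/temp3_input",
};
#define NUM_TEMPS (sizeof (coretemps) / sizeof (coretemps[0]))

/* Read one coretemp file; returns 0 if the file cannot be read. */
static long read_temp (const char *path)
{
	long t = 0;
	FILE *f = fopen (path, "r");
	if (f == NULL)
		return 0;
	if (fscanf (f, "%ld", &t) != 1)
		t = 0;
	fclose (f);
	return t;
}

/* Write the given governor to every CPU's scaling_governor file. */
static void set_policy (const char *governor)
{
	int cpu;
	for (cpu = 0; cpu < NUM_CPUS; cpu++) {
		char path[128];
		FILE *f;
		snprintf (path, sizeof (path),
		          "/sys/devices/system/cpu/cpu%d/cpufreq/scaling_governor",
		          cpu);
		f = fopen (path, "w");
		if (f == NULL)
			continue;
		fprintf (f, "%s\n", governor);
		fclose (f);
	}
}

int main (void)
{
	int hot = 0;
	for (;;) {
		size_t i;
		int any_above = 0, all_below = 1;
		for (i = 0; i < NUM_TEMPS; i++) {
			long t = read_temp (coretemps[i]);
			if (t > HIGH_TEMP)
				any_above = 1;
			if (t >= LOW_TEMP)
				all_below = 0;
		}
		/* switch to powersave when any core is too hot,
		   back to ondemand when all cores have cooled down */
		if (!hot && any_above) {
			set_policy ("powersave");
			hot = 1;
		} else if (hot && all_below) {
			set_policy ("ondemand");
			hot = 0;
		}
		sleep (5);
	}
	return 0;
}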

Update: Poll every 5 seconds rather than every two seconds.

June 26, 2011

Overheating

Filed under: Hacking,Linux,Technology — jlebl @ 5:01 pm

Hopefully I’ve solved my overheating problems with the Lenovo. First, using the nvidia blob seems to have lowered the GPU temperature, but it wasn’t enough. Turning off the “discrete graphics” and trying to run the thing on the Intel GPU led to scary kernel crashes. I’ve realized that cpufreq does not take CPU temperature into account (that’s kind of dumb, isn’t it). The few posts I found had solutions of the form “cpus should never overheat” and “reapply thermal paste” … yeah, that’s very useful. My ACPI does not report temperature for some reason, though lm_sensors seems to be working, so I guess the cpuspeed daemon won’t work. So it’s either hacking cpuspeed, or the simpler solution: just lowering the maximum speed of the CPU. That seems to be working beautifully. I tried very hard to overheat it and it’s still good. I can’t really tell that it’s slower, so I don’t really care.
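For the record, lowering the maximum speed is just a write to cpufreq’s sysfs interface. A minimal sketch (the cpu0 path is the standard one, but the 1600000 value, i.e. 1.6 GHz in kHz, is a made-up example; pick a frequency your CPU actually supports, see scaling_available_frequencies, and repeat for each CPU):

/* Cap cpu0's maximum cpufreq speed; must run as root. */
#include <stdio.h>

int main (void)
{
	const char *path =
	    "/sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq";
	FILE *f = fopen (path, "w");
	if (f == NULL) {
		perror (path);
		return 1;
	}
	fprintf (f, "1600000\n");	/* kHz, so 1.6 GHz */
	fclose (f);
	return 0;
}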

Still, I hoped this would have been solved long ago. I sort of assumed it was, actually.

Before I managed to “fix it,” I came up against the “run fsck manually” message, which I filed as a bug against Fedora, only to get a “what did that look like? that shouldn’t happen nowadays” response. Well, I am not about to replicate it, as I actually need to … you know … do work. And I don’t want to end up spending the day reinstalling the computer in case the filesystem really does get hosed.

Anyway, I’m not too happy with the Lenovo anymore. There are plenty of problems with this hardware. Given how much everyone was raving about Lenovo, I expected a lot better. Next time (which, given how this hardware seems to be holding up, is going to be soon) I will buy another one of those Linux-preinstalled laptops. The hardware will suck, but at least I won’t have to buy Windows.

I wish I could buy a laptop and keep it for years. That doesn’t seem to be a possibility. First you buy a laptop and must install a bleeding-edge distro to get all the hardware to work. Then, by the time the version of the distro you use is reasonably mature and bug-free (or you can switch to a long-term-supported kind of distro), your hardware breaks down, forcing you to buy a new laptop. The cycle of life!

I wish people built things that were meant to last for more than 1-2 years.

June 22, 2011

Update fever

Filed under: Linux,Technology — jlebl @ 8:23 pm

Yikes, Firefox 5 is out and Firefox 4 is EOL. Each time I used Chrome (I had to use Chrome to access the WebCT gradebook at UCSD) it had a different version number. I can’t quite tell the difference between browsing 3 years ago and browsing now, except that Chrome still doesn’t do Flash on 64 bits, and Firefox only manages it by running the 32-bit version of Flash in a wrapper.

Whatever people are smoking, I want some!

At the same time, my laptop (a Lenovo; not too satisfied anymore, not sure if I will buy one again) turns off about once a week, possibly from overheating, but it’s hard to tell.

Hell, I just want something that works! Why do people keep adding new features that break old features, so that no matter which version of software or hardware you use, you always end up with something broken?

I don’t care for the fastest hardware; as far as I can tell, it really isn’t any faster than it was a few years ago anyway. But the old hardware dies, and you have to buy new hardware that requires new drivers that are broken in new ways; before they get fixed, your hardware dies again.

Software seems to go the same way. What happened to quality engineering? The old rule of thumb is that writing a new feature is 5% of the work and making sure it doesn’t break everything is the other 95%. Now everyone wants to just skip the 95%.

The best example of good software is TeX and LaTeX. They have not changed in … decades. Yes, a new version of a macro does come out every once in a while, and a distribution will break the installation once in a while for some stupid reason, but the software itself is stable and mature. I can compile a document made 10 or 20 years ago without modification. I don’t have to learn anything new. It works, and it has quirks, but it has the same quirks for everybody, so they are usually well-documented quirks.

April 3, 2011

GNOME 3

Filed under: Linux,Technology — jlebl @ 8:29 pm

So, just in case my last post seemed too negative: I do like GNOME Shell, and I am using it. I am just a grumpy kind of person; I always was. So some of the GNOME 3 experience takes getting used to, some of it is annoying, but it is kind of cool. I think it could have been a lot nicer if there were not so many deliberately annoying aspects of GNOME Shell. Another example: the internal microphone needs some tinkering with ALSA levels. This was possible with the old GNOME alsamixer, and I probably would have figured out what was going on if I had that. The command-line alsamixer is too difficult for me (I can’t tell the difference between muted and not, and I have no clue how to move the left/right channels separately, which is what was needed to make the mic work).

My gripes are with the assumption that GNOME is running on perfect hardware and that only well-written apps run under it. That will never happen, no matter how much we wish it to. Ten years ago I thought that within a few years the Linux experience would be 100% out-of-the-box on almost any laptop. It’s still not there, and it never will be. That last 5% will take forever, not a couple of years. Mostly because even the Windows experience is not 100% good out of the box, even though it is preinstalled. I tried booting Windows before wiping it, and it already had some issues even though it was the stock experience; I found even the iPad to be buggy (and it made me laugh that Marketa’s Vista laptop has been crashing on shutdown ever since it was new; it started doing that before we even installed anything on it). I just tend to see problems with design that other people ignore for all the Kool-Aid they are drinking (this is especially true with Apple and/or Google).

Anyway, overall, I’m fairly happy with GNOME 3. And I’m sure it will get better in a few years. I’m just hoping that more essential things also get solved.

April 2, 2011

GNOME 3 experiences

Filed under: Linux,Technology — jlebl @ 7:09 am

So my ZaReason notebook decided to break (actually, it had been breaking for a while; the case is made of really terrible material). I’ve been looking to buy a Linux-preinstalled laptop, but finally saw a sale on a Lenovo U460 and decided to just get it. The machine is very nice and essentially everything works. I installed the newest Fedora alpha and updated to the latest bits, so I have GNOME 3 here.

The experience is not entirely positive. GNOME 3 is a solution in search of a problem. The things that GNOME 3 makes easier weren’t really all that difficult before; it doesn’t make anything important any easier. Basically, it improved one part of the desktop experience that was already “good enough.” There is nothing a user couldn’t have done before that they can do with GNOME 3. But there are things that were possible with GNOME 2 that aren’t with GNOME 3. So this improvement comes at the cost of making lots of more rarely done things much harder. If there are 100 such things, each one affecting only 1% of the users, it is entirely possible that 100% of the users are affected. I am sure that everyone will find a couple of things they need to do (not just want to do, but NEED to do) that are very hard, if not almost impossible, in GNOME 3. For example, for me, temporarily linking two computers with an ethernet cable was no longer possible from a GUI, and I could no longer figure out how to change the MAC address the network card uses in the new dialogs. Both were things I needed to do. It doesn’t help if someone tells me I shouldn’t have to do them if, say, the network setup (which is beyond my control) were done better.

A good UI gets out of the way. GNOME 3 more often gets in the way, by making things I needed to do harder, or impossible to find or do. So while much of GNOME Shell is nice, there are many places where it makes life harder on purpose, for whatever reason. GNOME 2.0 had the same philosophical problem.

There are many places where the Linux desktop is still deficient in ways that keep people from using it. GNOME 3 does nothing to improve that, in my opinion. It’s all nice in a perfect world, but we do not live in a perfect world where all hardware looks the same, all 3D drivers work, all people work the same way, and all the necessary software for Linux is already written.

Someone should try to fund a study to find out “why are you not using Linux,” or more specifically, “what does Linux not do that you need it to do before you will use it?” Surely it is not fixed workspaces and starting applications from a menu.

October 4, 2010

Damn Murphy!

Filed under: LaTeX,Mathematics,Teaching,Technology — jlebl @ 7:45 pm

Murphy’s law strikes again. The moment you publish something (be it a textbook or software), you find a bug. Yesterday I put new versions of both my diffyqs notes and my real analysis notes on the web. Already yesterday, I found that when I posted the real analysis notes on Lulu, I forgot to update the book cover to state that I’m now at UCSD. OK, that’s a minor thing; who cares.

Today I was preparing for my diffyqs class and found an error in the notes. Actually, I think I spotted this error in the spring when I taught at UIUC, but somehow forgot to fix it. So I found the first significant typo in the notes the day after posting new versions. And putting up new versions of the diffyqs notes is not as trivial as it may seem: it takes about 2 hours just to build the HTML version, because I have tex4ht render all the math as images. Doing some of the math using CSS is faster, but then you get a different font for some equations (as some are done using images and some using CSS).

If jsMath actually worked right, I would use that. But since I generally have a hard time making jsMath display things correctly on my own system, I assume it doesn’t work right for a lot of people. Also, jsMath seems to break for me on pages as large as the notes. MathML would be the perfect solution if it worked properly and consistently in all browsers. Right now, with MathML and tex4ht, you have to decide up front which browser to support, which defeats the whole idea.

Browser developers are generally more interested in floating 3D fish than in actually useful stuff like supporting MathML properly. One of the reasons I am sticking with Firefox is that they do support MathML reasonably, and have for a while; I take that as an indication of sanity on the part of the developers.

September 1, 2010

Adobe Acrobat is possibly the world’s slowest software

Filed under: Technology — jlebl @ 7:07 pm

See title. It took about 5 minutes to open a PDF that evince opens in a second or two and that xpdf opens essentially instantly. I was ready to xkill Acrobat. Unfortunately, for whatever reason, evince did not allow me to click the checkboxes in a form, so I needed Acrobat.

Speaking of slow applications, the Canon MP560 printer/scanner is now taking about the same amount of time to scan a single page. But there I assume it may have to do with the fact that I am using it over wifi.

August 31, 2010

Directed acyclic graphs suck for a VCS

Filed under: Hacking,Technology — jlebl @ 7:28 pm

DVCS systems like git work on a directed acyclic graph (DAG) model, where branching happens automatically with just about any commit. A traditional VCS (e.g., CVS) generally works on a tree, where branching must be done explicitly.

Now, the argument for DVCS is that you can commit without merging in changes made by someone else. The axiom seems to be:

Axiom of DVCS: Merging should be done as late as possible.

What’s wrong with that? Well, nothing, if you are happy for the computer to just blindly merge two very divergent code bases without worrying about the interactions of those changes (you kept hacking on feature A, which required function foo to work in a certain way, while the person working on feature B changed how foo works, because he didn’t see anybody using it yet; your use of it was in a branch that the second person of course didn’t look at, because he was busy working on feature B). No, that never ever happens, because all developers always talk to each other about every little change, and because every internal function/method/object is completely documented. Yeah.
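To make that scenario concrete, here is a contrived sketch (the names and the seconds/milliseconds contract are invented for illustration): both branches merge cleanly, the result compiles without a single conflict, and it is still wrong.

#include <stdio.h>

/* Branch B quietly changed foo() to return milliseconds (it used to
 * return seconds); nobody on B's branch was calling it at the time. */
static long foo (void)
{
	return 5000;	/* branch B: was "return 5;" (seconds) */
}

int main (void)
{
	/* Branch A was written against the old contract.  The merge is
	 * textually clean, so nothing flags that this is now off by 1000x. */
	long timeout = foo ();	/* branch A still thinks this is seconds */
	printf ("waiting %ld seconds\n", timeout);
	return 0;
}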

With traditional VCS, merging is required when checking in. The axiom changes to:

Axiom of traditional VCS: Merging should happen as early as possible, when the divergence is as small as possible.

When everyone continually has to keep up with all the changes other people are working on, the possibility of screwing up is smaller. Merging other people’s changes into your tree can be far simpler if it is just someone’s morning’s worth of work, say 100 lines of code. You can actually look at that quickly to review what happened. Not to mention you will have to look at it if there are direct conflicts with your work. Furthermore, if the other person changed how function foo works, you will notice sooner rather than later, so you can resolve the conflict before both of you go too far assuming function foo works in a certain way.

I know exactly why DVCS is more popular nowadays. First, it is new, and new things are always better, even if they are worse. Second, it is more complicated, and complicated things must always be better. But most importantly of all: DVCS has a lot more buzzwords. It is distributed, it uses directed acyclic graphs (finally, a use for some of your CS classes). Lots of things that work keep getting replaced by complicated things that don’t.

Example: I would say the level of desktop software (Windows, Linux, and Mac) has not improved substantially. It has changed, yes. It does lots of graphical voodoo. It allows you to do things that nobody ever wants to do. But if you took a basic desktop from the year 2000 and simply fixed it to work right, you’d have a much faster, much more productive environment. But fixing things is not as much fun as rewriting a desktop on OpenGL, making windows wiggle, and making a different funky widget set for each application.

July 14, 2010

Microtypography

Filed under: LaTeX,Mathematics,Teaching,Technology — jlebl @ 5:38 pm

I have been playing around with the microtype package for pdfLaTeX. The results are really nice. Using the font expansion does increase the size of the PDF a tiny bit, but not much; it is definitely worth it, I think. Overall, using the microtype package I seem to be getting better line breaks, especially in tight places where there are floating figures (where text flows around them). To use it, simply add
\usepackage{microtype}
to your file, and make sure to use pdflatex rather than latex and dvipdf.

It does two things. First, it protrudes punctuation into the margins (say, periods actually hanging off the sides of your paragraphs) to make the justification look straighter. Second, it may “stretch” the font by a tiny bit on certain lines to get a more even “greyness” of the text (for example, more uniform inter-word spacing). This also gives the justification algorithm more freedom in finding line-breaking points, so you generally get better line breaking (less hyphenation, etc.). It is the font stretching that adds a bit to your files, since you need more copies of the font in the file, but the size increase is not terribly big on large files in relative terms. Still, with microtype and PDF 1.5, the 2MB differential equations PDF actually goes down by about 200k compared to no microtype and PDF 1.4.
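If you want to control the two effects separately, microtype takes package options; something like the following should work (check the microtype documentation, as I am quoting the option names from memory):

\usepackage[protrusion=true,expansion=true]{microtype}
% or keep the protrusion but skip the font expansion (and its size cost):
% \usepackage[protrusion=true,expansion=false]{microtype}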

I want to do a bit more cleaning up and perhaps some more fixes before I post updates to the Notes on Diffy Qs, Basic Analysis, and the SCV minicourse. Probably within a few days.

Speaking of the notes, it is interesting that the real analysis notes are now downloaded by new unique IPs more frequently than the differential equations notes; on average, about 30%-40% more. That is surprising: I would have thought that real analysis (taken almost exclusively by math majors) would be less interesting to “the masses” than differential equations at the calculus level (taken by almost every technical major).
