I'm too lazy to update the art section on this site, and online hosting services provide much better album interfaces. So anyways, here's a link to some more recent work: my imgur page.
I recently finished up two 10 lb. bags of Costco flour (Bob's Red Mill Organic unbleached) with great success and high repeatability. I am now working through a 10 lb. bag of Gold Medal bleached white flour, and it seems to require substantially more water (about 50% more) in the mix to obtain the dough consistency I have come to expect. So now it's about 3 cups flour to 1.5 cups water, which more or less matches some of the other recipes I found online a while back. It's pretty remarkable how much the brand of flour affects the recipe. It feels like Bob's Red Mill flour is more finely milled (or maybe less finely?) because it didn't clump up quite as much as the big-box name-brand flour. After adjusting to equalize the initial dough consistency, the end result is roughly the same.
3 cups flour, 1.5 tsp salt, 4 tbsp starter, 1.25 cups warm water. 10 hour rise, double fold, 45 min warmed rise, 450 for 30 min with water (slit after 5 min), 15 min at 400.
This is finally a qualified success. The first rise was warmed toward the end by sitting above a slightly heated oven. The second rise was in the latently heated oven, and the dough spread outward. 450/400 seems to be the right temperature. The second rise needs to be done on a surface different from the final baking surface; otherwise the bread ends up firmly stuck to the pan.
3+ cups flour, 1.5 cups water, 4+ tablespoons starter, 1.5 tsp salt. 15 hour rise. Double fold and 2 hour rise. Oiled and slit, then baked at 475 for 30 min, 450 for 15 min.
During the second rise, it rose outward instead of upward again, possibly because I forgot to mix the salt thoroughly into the flour before adding wet ingredients. Apparently 475 is still too hot.
3 cups flour, 1.5 cups water, 4 tablespoons starter, 1.5 tsp salt (it was really hard to incorporate all the flour). 12 hour rise, double fold and 1 hour rest. Baked at 500 with water pan for 7 min, slit top, another 13 min (noticed it was browning too much), reduced to 450 for 8 min, then reduced to 400.
Well, that certainly rose higher than expected. Inside still looks wet.
0.5 tsp salt, 3 cups flour, 4 real tablespoons culture, 1.8 cups water. 12 hour rise. Patted down and folded twice, covered rise for 50 min over warm area. Baked at 400 for 10 min with water pan, oil misted and slit top, and baked 20 min more. Removed water pan and baked 10 more minutes. Still not rising enough, still too much water; lesson not learned.
3 cups flour, 4 real tablespoons culture, 1.5 cups water + 1 tsp, 1.5 tsp salt. 16 hour rise. Patted down and folded twice. Baked at 400 with water pan for 5 min. Oil misted and slit top and baked 20 minutes more. Removed water pan. Baked another 20 min. The overall shape is more correct now, rising up instead of out. Still not browning enough. Need to try more steam, although it looks wet.
I probably added too much water, since the dough was a bit too fluid this morning. Had to separate it out into two separate lumps for baking. Set to 475 for 10 minutes, reduced to 425 for another 20 minutes. Forgot to slit the tops before putting them in the oven, and it shows by the excessive rise height after 10 min for one of them. Reduced to 400 for another 10 min. Still not very brown, turned off oven and removed water pan, but left loaves inside for another 5 min.
Result: very crunchy on the outside, but only slightly brown. Inside is still too wet and doughy, but definitely edible. Next time definitely use less water in dough, and remove water pan at the 20 minute mark.
Also, fed the culture another 0.25 cup water and 0.25 cup flour. Smelled very alcoholic; this might be bad.
Today, I am trying my hand at making bread and growing a yeast culture. I will post randomly in the future about ingredient parameters and results.
Culture: 1 packet expired Fleischmann's active dry yeast, half cup flour, half cup water, 1 tsp sugar. Sitting on top of fileserver.
Proof: 3 cups flour, 4 heaping kitchen spoons of culture, 1.5 tsp salt, 1.5+epsilon cups water, mixed to watery dough consistency. Sitting atop desktop over CD drive. After extracting from culture, about 1.25 cups remaining.
Recently, a new scientific computing language, Julia, has been released (or rather, has received much attention). I attended a talk by one of its creators last week and was less than impressed. Granted, it is still very much in its infancy, but it seems more like a programming language research project than a language created to solve problems, even though that is how it is being sold. The talk largely focused on the interesting aspects of the language itself (in fairness, I don't see how else one would fill an hour timeslot).
My primary problem with it is that it appears to be a variant of Matlab, and there are already lots of Matlab clones out there, like Octave. The problem with Matlab is that it is such a disorganized language, with lots of annoyances that make actual programming extremely painful: the random hodge-podge of short built-in functions in the global namespace, the 1-based indexing (yes, this is actually a hassle when you want to implement complicated algorithms), and the lack of concrete typing.
There are a few positives. Its automatic parallelization is neat, but its ability to scale up automatically remains to be seen. The fact that it's entirely open source is also a huge plus.
The closest to an ideal language at the moment is Lua, except for its 1-based indexing. A more C-like syntax and array slices would be nice, too. Here is my ideal scientific computing language checklist:
- C-like syntax. Seriously. It is so well established that most people can read it. Anything other than curly braces, to me, just makes identifying blocks harder than it needs to be. I might be biased on this one.
- 0-based indexing (ranges are lower inclusive and upper exclusive). Mathematicians, shut up. When you try to obtain sub-blocks of matrices, it is far more natural to think in terms of offsets rather than ordinal positions. Reading Matlab and Fortran code that tries to do this is extremely confusing with the off-by-1 corrections in the slices. This phenomenon extends to areas far beyond matrix slicing.
- Multiple return values from functions. Wait hold on. Let's start with actual fuckin' functions (I'm looking at you, Matlab). While Matlab actually does allow multiple and contextual function return values, the fact that it forces you to put each function in its own file is absurd. All new languages that don't allow multiple return values: why? This is 2012. We can build this. We have the technology.
- Sensible function names. "eig" is NOT a good function name. Hm, let me compute an eigenvalue and store it in a variable. What shall I call it? How about "eig". Well fuck. Mathematica has the right idea here. The function names are borderline-too-long, but they are clear, descriptive, and unlikely to clash. The Mathematica module system is also quite sensible, with namespaces for functions and variables.
- Compilable to clean C-code. This is a feature that basically no language in existence supports. For scientific computing, the final stage in prototyping is producing production-ready code. Which means standards compliant C89 code, period. Anything more sophisticated risks not being supported on the target platform. This feature pretty much requires some kind of type declaration or hinting system in the language. I'm okay with this being required, but practically, it is better if this is required only for code generation.
Note that I don't require special matrix syntax, such as the single-quote transpose operator in Matlab or the ungodly quantity of non-ASCII symbols in Mathematica. The language of mathematics will always be too rich and fluid for a programming language to keep up with, especially one that hopes to maintain support and compatibility. Until this holy grail of a language comes around, I will stick to C and Lua.
My geometric ruler-and-compass construction app is now online, having been ported from C++. I emailed David Goines last week to see if I could put his letterform constructions online, and I was ecstatic when he responded in the affirmative. Clearly the app is still a bit rough around the edges and lacks proper documentation, but that will be forthcoming.
The lack of good, simple C code for accomplishing relatively basic numerical tasks is, quite simply, shocking. I am a numerical prototyper who uses C, and it is usually painful to see packages written in higher-level languages that I cannot use. So here is a wishlist.
- Single file C library for linear programs (lp_tiny is a terrible solution)
- Single file C library for quadratic programs (QuadProg++ is a partial solution)
- Single file C library for sparse LDL^T factorizations of symmetric indefinite matrices
- Single file C library for sparse LU factorizations of general square matrices
- Single file C library for out-of-core sparse LDL^T factorizations of symmetric indefinite matrices
- Single file C library for out-of-core sparse LU factorizations of general square matrices
- Simple C implementation of a boundary integral equation method in 2D (acoustic scattering)
- Lattice classification routine (into the 14 Bravais lattices) given three 3D lattice basis vectors
- Simple C implementation of 2D (and 3D) periodic Delaunay triangulation (CGAL is anything but simple)
- A C callable library for convex optimization (a port of CVXOPT would do)
- A C++ rewrite of LAPACK without template metaprogramming (RNP is a tiny step in this direction)
- A variant of Lua with 0-based indexing