Tuesday, August 11, 2020

Go: Convert errors.Wrap calls to fmt.Errorf

I was a longtime fan of https://github.com/pkg/errors. It was a great way to add context to why an error was being returned, which made tracing errors easier. The need for pkg/errors has gone away with the new fmt.Errorf %w verb, errors.Is(), and errors.As().
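
For anyone still making the switch, here's a minimal before-and-after sketch (loadConfig and the path are just placeholders):

package main

import (
        "errors"
        "fmt"
        "os"
)

func loadConfig(path string) error {
        if _, err := os.Stat(path); err != nil {
                // Before: return errors.Wrap(err, "statting config")
                // After: wrap with %w so callers can still inspect the cause.
                return fmt.Errorf("statting config %s: %w", path, err)
        }
        return nil
}

func main() {
        err := loadConfig("/does/not/exist")
        fmt.Println(err)
        // errors.Is walks the %w chain, replacing errors.Cause comparisons.
        fmt.Println(errors.Is(err, os.ErrNotExist)) // true
}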

I used errors.Wrap() a lot so naturally my code has lots of function calls I need to migrate. One repo had close to 300 calls to errors.Wrap() which is more than I'm willing to do by hand. I wrote a simple tool to take care of the most common case I have: errors.Wrap(err, "<message>").

  • Any line it doesn't know how to handle it leaves unchanged.
  • It looks specifically for errors.Wrap(err, "
  • On that same line it expects to find a double quote followed by a closing paren
  • The existing context string has : %w appended to it
  • It does not edit your imports; you should run goimports or a similar tool
  • By default the tool just outputs to stdout; use -o to overwrite the file in-place
  • Fix everything by doing for i in $(grep -Rl errors.Wrap .); do errors_wrap_convert -in $i -o; done
  • Definitely make sure you have a snapshot of your code to revert back to in case this tool does bad things
  • This could have been done better using gofix but I was in too much of a hurry to learn how to extend gofix.
# errors_wrap_convert.go
package main

import (
        "bufio"
        "bytes"
        "flag"
        "fmt"
        "io"
        "io/ioutil"
        "log"
        "os"
        "strings"
)

var (
        fIn        = flag.String("in", "", "input file")
        fOverwrite = flag.Bool("o", false, "overwrite the existing file")
)

func fatalIfError(err error, msg string) {
        if err != nil {
                log.Fatal("error ", msg, ": ", err)
        }
}

func main() {
        flag.Parse()
        b, err := ioutil.ReadFile(*fIn)
        fatalIfError(err, "reading input file")

        var out io.WriteCloser = os.Stdout
        if *fOverwrite {
                out, err = os.Create(*fIn)
                fatalIfError(err, "opening output file")
        }
        defer out.Close()

        scanner := bufio.NewScanner(bytes.NewBuffer(b))
        for scanner.Scan() {
                fmt.Fprintln(out, Rewrite(scanner.Text()))
        }
        fatalIfError(scanner.Err(), "scanner error")
}


// Rewrite converts a line containing errors.Wrap(err, "...") into the
// equivalent fmt.Errorf("...: %w", err) call. Lines it doesn't recognize
// are returned unchanged.
func Rewrite(in string) string {
        idx := strings.Index(in, `errors.Wrap(err, "`)
        if idx == -1 {
                return in
        }

        // The first ')' after the call starts; assumed to close errors.Wrap.
        eIdx := strings.Index(in[idx:], ")")
        if eIdx == -1 {
                return in
        }
        eIdx += idx

        // The opening quote of the context string.
        q1Idx := strings.Index(in[idx:], `"`)
        if q1Idx == -1 {
                return in
        }
        q1Idx += idx

        // The character just before the closing paren must be the closing quote.
        q2Idx := eIdx - 1
        if in[q2Idx] != '"' {
                return in
        }

        // Rebuild the line, reusing the context string and appending ": %w", err.
        out := in[:idx] +
                `fmt.Errorf(` +
                in[q1Idx:q2Idx] +
                `: %w", err)` +
                in[eIdx+1:]
        return out
}

And a couple of basic tests:

# errors_convert_test.go
package main

import "testing"

func TestRewrite(t *testing.T) {
        t.Parallel()
        for in, want := range map[string]string{
                "": "",
                `        return nil, errors.Wrap(err, "bad thing") // foo bar`: `        return nil, fmt.Errorf("bad thing: %w", err) // foo bar`,
                `return nil, errors.Wrap(err, "foo " + blarg + " bar")`: `return nil, fmt.Errorf("foo " + blarg + " bar: %w", err)`,
        } {
                got := Rewrite(in)
                if got != want {
                        t.Fatalf("got %q, want %q, for %q", got, want, in)
                }
        }
}

I had searched for a tool to do this but either it doesn't exist or my searching ability failed me. If you would like to pick this up and generalize it, I'd happily refer to your version as the canonical one.

Sunday, August 2, 2020

The Autobucket Saga

The Leaking A/C and Early Failure

A few years ago our air conditioning started leaking. We discovered this when a stream of water started running from the corner of our kitchen's ballast lighting. Naturally we were alarmed. We found the location of the leak and got a plastic tub underneath it to catch the water. Until we could get a technician to the house we got to choose why we weren't sleeping well; either because it was too hot with the A/C off or every couple of hours one of us had to empty the tub with a wet vac.

This problem annoyed me. I'm a smart, technical guy, I should be able to solve this. I had taught myself some electronics and should be able to programmatically control a pump. I bought a little 5v USB pump and some float switches. I hooked it all up to a raspberry pi with the pump's power being controlled by the pi via an NPN transistor. Pump turns on when the high-water mark switch closes. Pump turns off when the low-water mark switch opens. Super simple, and for the life of me I couldn't get it to actually work.

The Student Elevates Himself

Since then I've learned a lot more about electronics, though I'm still a newbie. This year at BSides San Diego I bought an arduino-compatible microcontroller board and some other components as a way to help fund the event. Since I bought them I had to experiment with them!

The basic stuff is pretty easy! In the course of fiddling and experimenting I realized the problem with my original setup: I wasn't tying either the pump control transistor or the float switches to a ground reference (via a resistor, of course).

I spoke about my new understanding and new confidence to my loving partner. She noted how the condensation from the A/C just gets pumped out down the side of the house. We catch it with a bucket but rarely think to dump that water on our orange trees. It sure would be nice to have the water moved over there automatically! I was inspired.

The Autobucket Is Born

This came together on a breadboard pretty quickly.

Random USB wires to your gaming laptop is fine, right?

Each float switch goes to a GPIO pin on a raspi zero w with the other side having a 10k resistor to ground and a 1k resistor to the pi's 3v3. I elected to have discrete pulldown resistors rather than integrated pulldown/pullup purely because I understand it better. The pump is directly connected to USB 5v with ground going to the NPN's collector. The base has 10k to ground and 1k to a GPIO pin. I wrote all the software in Go with gobot, including a feature to notify me via Telegram as the pump cycles on and off.
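
The control logic only needs a few lines with gobot. Here's a minimal sketch of the shape of it, leaving out the Telegram part: the pin numbers are placeholders, the switch polarity assumes the pulldown wiring described above, I'm (ab)using the LED driver as a generic digital output for the transistor base, and the import paths may differ depending on your gobot version.

package main

import (
        "gobot.io/x/gobot"
        "gobot.io/x/gobot/drivers/gpio"
        "gobot.io/x/gobot/platforms/raspi"
)

func main() {
        r := raspi.NewAdaptor()

        // Pin numbers are made up; use whatever you actually wired.
        high := gpio.NewButtonDriver(r, "11") // high-water float switch
        low := gpio.NewButtonDriver(r, "13")  // low-water float switch
        pump := gpio.NewLedDriver(r, "15")    // GPIO pin driving the NPN base

        work := func() {
                // Water has reached the high-water mark: start pumping.
                high.On(gpio.ButtonPush, func(data interface{}) {
                        pump.On()
                })
                // Water has dropped below the low-water mark: stop pumping.
                low.On(gpio.ButtonRelease, func(data interface{}) {
                        pump.Off()
                })
        }

        robot := gobot.NewRobot("autobucket",
                []gobot.Connection{r},
                []gobot.Device{high, low, pump},
                work,
        )
        robot.Start()
}

The Telegram notifications would just hang off those same two callbacks.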

Amazingly, it just worked! On my bench I could move the float switches and see the program change state. Rather than powering on a dry pump I plugged in a device that charges via USB with a charging indicator LED. In the appropriate states it would turn on and off. Awesome!

Next up, the bucket. We have a bunch of orange buckets so hacking one up wasn't an issue:

Water, wires: besties!

I cut a couple of holes at different heights and installed the float switches. With their rubber gaskets it didn't even leak! I dropped the pump in there and for the time being just accepted that it didn't sit fully at the bottom but sufficiently below the low-water mark.

I didn't want the control system exposed to rain and sunlight and I needed it to be near power. I have a covered patio nearby with AC outlets, I just needed to establish connectivity between the two points. I needed 5 wires: USB+, USB ground, 3v3+, float switch 1, float switch 2. I have a supply of CAT-5 which is great since it even provides easy to differentiate individual wires. I wanted to be able to disconnect it so two wires went to a female USB connector and three into a molex hard drive power connector from my parts box. With this in place I could connect and disconnect as needed. Once things were settled I could shrink-wrap the connections for some weather protection.

I'm playing outside!

Testing again with the circuit on the board and again, success! You may note in this photo that power is supplied by exposed USB wires and gator clips. This is fine for a day of testing but not a workable long-term solution.

I'd like a neater board but don't keep perfboard handy. I do, however, have a 3D printer, a decent ability with CAD, and general lack of good judgement.


Sorry about your gag reflex.

Since I'm already using CAT-5 to connect to the bucket, why not RJ45? I had some RJ45 keystone jacks so I super glued a couple to my board. One was intended to connect to the bucket, the other to go to the raspi. Instead I ended up connecting to the raspi via a female header connector snipped in half then glued together so that I could easily connect and disconnect to GPIO. While I was at it, I hooked up a BME280 sensor via I2C so I'd have an outdoor temperature/pressure/humidity sensor whose readings I could expose via a web server.

For power I grabbed a phone cable I had with only two wires. Part of the way through the conversion I had something that at least made me chuckle:

Windows is configuring your new magic smoke

This will need an enclosure but for the next step I started with a semi-disposable plastic storage container.

Pioneering Avant Garde project enclosures

And much to my surprise, it's still working at this stage. Before I go for a more permanent enclosure I want to let it run for a few days and make sure it doesn't need any changes.

Trouble In the Garden of E-Dumb

It does work, mostly. The pump is tiny and weak, which is to be expected. I don't really care how fast the water makes its way to the orange trees, just that it gets there. Eventually, though, the pump gets started but never shuts off: the pump is on but no water is flowing. It has to push the water through 1/4" inner diameter vinyl tubing up the height of the bucket, then a few yards over to the tree. I fluff the tubing and the water starts flowing. My hope at this point is that I'd only have to prime it after it's been idle for a while. The next step up in pump power is 12v and I'm reluctant to go there.

This problem persists. If I prime it, it gets going; otherwise not much happens. At first I thought maybe the primary purpose the pump serves here is to get the initial water over the height of the bucket, with siphoning taking care of the rest. It's not reliably doing even that, though.

It eventually occurs to me that I can help the siphoning action by elevating the bucket. The path along the ground from the bucket to the tree is all flat. If the water source is elevated higher than the destination the siphoning should be more effective. I'm about to arrive...

The Autobucket: Passive Edition

I eventually realize I'm being pretty stupid: gravity is doing most of the work here, and I can ensure that the water reservoir is higher than the outlet.

I don't need the pump, the sensors, or the control at all. I need a hole near the bottom of the bucket and to seal the tube in that hole. Gravity will cause the water to drain through the tube. I had fixated on a solution to the complete neglect of the objective.

In summary:
  • What I built: A network-connected gray water reclamation and irrigation system
  • What I needed: A bucket with a hose glued into it

Sometimes the dumb solution is the right solution

Reflections

I basically took three lefts instead of a right but the journey wasn't all for naught. I got validation that I had overcome my gaps in electronics knowledge since the leaky air conditioner.

One thing that went very well was my process. In past projects I've had frustrating failures by pushing through to a complete solution. When something didn't work I had gone so far that troubleshooting meant tearing down and starting over. This time I worked much more incrementally, validating my progress at each stage and having the chance to make corrections. While I rarely needed corrections along the way, the anxiety that I might have screwed something up and wasted all my time was minimal.

I diagrammed lots of things. I put things together on the breadboard to make it work, then translated that to a circuit diagram that I could follow more easily. When wiring up the CAT-5 I wrote down which colors would do what before I made any connections. From then on I could easily know which was the correct wire. I could then maintain that color scheme for portions downstream from the cable to keep it consistent and easier to wrap my head around. I haven't yet built up the electronics chops to keep what each wire does in my head.

Wire type really matters. I've often used internal CAT-5 strands as plentiful solid-core wire with color-coding. When I put that on the female pin header it didn't flex for anything and was very hard to work with. I had some stranded wire but in only three colors, which complicated things. I ordered an assorted-color stranded wire kit for subsequent projects and it's been great.

The weather sensor was a great addition. I loved being able to check it from my phone. I ended up scrapping the raspi setup and setting up just weather sensors with the BME280, an Adafruit ESP8266 board, and some software that made the data available in Prometheus format.

Chibi Weather Station

Ultimately we don't learn a whole lot from our successes; we learn much more reflecting on failure. Maybe my missteps can be useful to you.

Thursday, February 27, 2020

Priority Channel in Go

I'm kind of impressed with this ugly monster:

package main

import (
        "context"
        "fmt"
        "sync"
        "time"
)

func main() {
        const levels = 3
        const sourceDepth = 5
        sources := make([]chan int, levels)
        for i := 0; i < levels; i++ {
                sources[i] = make(chan int, sourceDepth)
        }
        out := make(chan int)

        ctx, cancel := context.WithCancel(context.Background())

        wg := &sync.WaitGroup{}
        pc := New(ctx, sources, 10, out)
        wg.Add(1)
        go func() {
                defer wg.Done()
                defer close(out)
                pc.Prioritize()
        }()

        wg.Add(1)
        go func() {
                defer wg.Done()
                for i := range out {
                        fmt.Println("i: ", i)
                        time.Sleep(time.Second / 4)
                }
        }()

        for _, i := range []int{0, 2, 1, 0, 2, 1, 0, 2, 1} {
                fmt.Println("submitting ", i)
                pc.Submit(i, i)
        }
        time.Sleep(time.Second * 3)
        cancel()
        wg.Wait()
}

// PriorityChannel multiplexes several input channels onto one output channel,
// always draining the lowest-numbered (highest-priority) source first.
type PriorityChannel struct {
        notify  chan struct{}
        sources []chan int
        out     chan int
        ctx     context.Context
}

// New builds a PriorityChannel. notify acts as a semaphore pre-filled to cap;
// each Submit frees a slot after enqueueing an item, which is what lets
// Prioritize proceed.
func New(ctx context.Context, sources []chan int, cap int, out chan int) PriorityChannel {
        pc := PriorityChannel{
                notify:  make(chan struct{}, cap),
                sources: sources,
                out:     out,
                ctx:     ctx,
        }
        for i := 0; i < cap; i++ {
                pc.notify <- struct{}{}
        }
        return pc
}

// Prioritize forwards one pending item at a time to out, scanning the sources
// in priority order. It runs until the context is cancelled.
func (pc PriorityChannel) Prioritize() {
        for {
                // block until there's a value (a Submit has freed a notify slot)
                select {
                case pc.notify <- struct{}{}:
                        // proceed
                case <-pc.ctx.Done():
                        return
                }
        SOURCES:
                for _, rcv := range pc.sources {
                        select {
                        case i := <-rcv:
                                pc.out <- i
                                break SOURCES
                        default:
                                // nothing at this priority; keep looping
                        }
                }
        }
}

// Submit enqueues i at the given priority (lower is more urgent) and signals
// Prioritize that there's work to do.
func (pc PriorityChannel) Submit(i, priority int) {
        if priority < 0 || priority >= len(pc.sources) {
                panic("invalid priority")
        }
        pc.sources[priority] <- i
        <-pc.notify
}

Monday, January 13, 2020

The Tool Concert: A Synopsis

Drummer: <Bonk uh dunk
Bonk uh dunk
Bonk uh dunk tsh
Bonk uh dunk
Bonk uh dunk
Bonk uh dunk tsh>

Lead Guitar: <Grong gugga gug
Gug grong gugga gug
Gug grong gugga gug>

Bass Guitar: <Do doon doon do>
(no one knows what a bassist is doing)

Singer: I can't express the pain of being intellectually superior to everyone

Background Visuals: <Terence McKenna and David Cronenberg are fighting for control of the Winamp visualization plugins>

My conclusion: Tool is an alternate reality version of Phish where they dedicated themselves to rebelling against yuppies and neocons.


In all seriousness though, it was a really great show, and I'm not really a Tool fan. For the songs I was familiar with, they sounded exactly as you hear them on the radio; Rush-level precision.

The opening act was awful and I won't dignify the name. Tool played a two and a half hour set which included a fifteen minute intermission. Also, I've never seen a crowd so engaged.

I was surprised by how much I liked the show. Not my favorite style of music but they really did earn my respect. 

Friday, November 29, 2019

It's Possible To Not Feel Like Garbage

I commonly see a type of post on social media. In this post, the person says something to the effect of "You matter" or "You are loved". The intent is to bolster the spirits of people who feel hopeless. It's well-meaning but in my opinion is useless at best and counterproductive at worst.

When you have depression part of your brain is dedicated to crushing your spirit. It knows all about you; all your doubts, fears, and regrets which it will use to bring down your sense of self. It is always with you and always working against you.

Your own self tells you that you are worthless and unlovable. So when a stranger says you have value and that you are loved it doesn't come across as a message of hope. At best it's a message of ignorance and at worst it's patronizing. "You don't know the first thing about me" is obvious and true. A perfect stranger telling you a fact about yourself is pretty hard to swallow.

A better message is "You don't have to feel like garbage". Depression makes you believe that feeling worthless is simply natural to you. It sounds silly but the idea that you can feel simply okay is a genuine message of hope. True happiness might be unrealistic but "not garbage" is something people with depression do experience on occasion. It's plausible that this is a normal state and might be achievable.

When trying to reach out, keep in mind that your positive messages may be hard to believe. People that may need to hear you won't always listen or be ready to understand. Be patient, be open-minded, and accept that their problems are unique to them and will require solutions unique to them that you may not have access to.

Friday, June 14, 2019

The Scenic Route To Go Interfaces

Go is an awesome language and interfaces are one of its most powerful features. They allow for decoupling pieces of code cleanly to help make components like database implementations interchangeable. They're the primary mechanism for dependency injection without requiring a DI framework.

Newcomers are often mystified by them but I think they're less confusing if you get to them via the scenic route. Let's look at creating our own types in Go. Along the way we'll find parallels that help make interfaces more clear.

Sidebar: Java Interfaces

If you're not experienced with Java, move on to the next section. Nothing to see here.

If you're experienced with Java, Go interfaces will be pretty familiar and comfortable. The key difference is that a class in Java must explicitly implement a predefined interface. In Go, any type that has the proper method signatures implements an interface, even interfaces created after the type. In Go, implementing an interface is implicit.

Custom Primitive Types

Go emphasizes simple, clear types. You can define your own to help model your problem space. Here I might want to capture a set of boolean flags in one variable:

type BitFlags int32
  • I'm defining my own type
  • I'm giving it the name BitFlags
  • It represents an int32
Why not just use int32 if that's what I want?

One reason is methods. I can attach methods to a type I've defined to give my type additional behavior. Perhaps I define a bunch of constants to represent individual flags and I provide methods like IsSet(BitFlag) bool and Set(BitFlag).
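
A sketch of what that might look like, building on the BitFlags type above (the BitFlag type, the flag names, and the method bodies here are my own illustration, not anything from a real library):

type BitFlag int32

const (
        FlagRead BitFlag = 1 << iota // 1
        FlagWrite                    // 2
        FlagAppend                   // 4
)

// IsSet reports whether the given flag is set.
func (f BitFlags) IsSet(flag BitFlag) bool {
        return f&BitFlags(flag) != 0
}

// Set turns the given flag on.
func (f *BitFlags) Set(flag BitFlag) {
        *f |= BitFlags(flag)
}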

Another reason is explicit type conversion. In other languages it's valid to assign a 64-bit integer value to a 32-bit integer variable. They're both integers so it's logical to do so. However, you're possibly losing the high 32 bits of the source value. There's an implicit type conversion happening that is often silent and often surprising.

Go doesn't allow implicit type conversions:

i32 := int32(17)
var bf BitFlags
bf = i32 // not allowed
bf = BitFlags(i32) // just fine

This is done to eliminate surprises. The compiler isn't silently inserting a type conversion that can change your data without your knowledge. It requires that you state that you want the conversion. This makes it harder for users of the BitFlags type to accidentally provide a numeric value that shouldn't be interpreted as flags.

Custom Struct Types

type Foo struct {
        A string
        B int
}
  • I'm defining my own type
  • I'm giving it the name Foo
  • It contains the following data
Structs allow you to bundle pieces of data together into a single item. That item can be passed around as a unit. Like custom primitive types, custom struct types can have methods attached.
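
For example, a method attached to Foo might look like this (Describe is just an illustrative name, and it assumes fmt is imported):

func (f Foo) Describe() string {
        return fmt.Sprintf("%s/%d", f.A, f.B)
}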

Also like custom primitive types, you can convert one struct type to another if their fields are identical, using an explicit type conversion:

type Foo struct {
        A string
        B int
}

type Bar struct {
        A string
        B int
}

func main() {
        f := Foo{A: "foo", B: 3}
        var b Bar
        b = f // invalid
        b = Bar(f) // just fine
}

Custom Interface Types

An interface type specifies requirements for behavior. Methods are behavior, which is why we tend to name them with verbs or action phrases. In Go, any type that has those exact method signatures satisfies the interface's requirements.

type Storage interface {
        Create(key string, o Object) error
        Read(key string) (Object, error)
        Update(key string, o Object) error
        Delete(key string) error
}

  • I'm defining my own type
  • I'm giving it the name Storage
  • Anything with these methods qualifies as this type

Using interfaces I can define requirements for a storage system for my application to use. My application needs something through which I can create, read, update, and delete objects associated with a given key.

func GeneratePDFReport(output io.Writer, storage Storage) error {
        // ...
}

My application isn't concerned with how those operations are actually performed. The underlying storage could be an SQL database, S3 bucket, local files, Mongo, Redis, or anything that can be adapted to do those four things. Perhaps the report generator supports many storage mechanisms and when the application starts it decides which storage to use based on a config file or flags. It also means that when I need to write tests for my report generator I don't need to have an actual SQL database or write files to disk; I can create an implementation of Storage that only works with test data and behaves in an entirely predictable way.
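
As a sketch of how cheap such a test implementation can be, here's a map-backed Storage (assuming Object is a struct type and fmt is imported; none of these names come from a real library):

// memStorage is an in-memory Storage implementation for tests.
type memStorage struct {
        objects map[string]Object
}

func newMemStorage() *memStorage {
        return &memStorage{objects: map[string]Object{}}
}

func (m *memStorage) Create(key string, o Object) error {
        if _, ok := m.objects[key]; ok {
                return fmt.Errorf("key %q already exists", key)
        }
        m.objects[key] = o
        return nil
}

func (m *memStorage) Read(key string) (Object, error) {
        o, ok := m.objects[key]
        if !ok {
                return Object{}, fmt.Errorf("key %q not found", key)
        }
        return o, nil
}

func (m *memStorage) Update(key string, o Object) error {
        if _, ok := m.objects[key]; !ok {
                return fmt.Errorf("key %q not found", key)
        }
        m.objects[key] = o
        return nil
}

func (m *memStorage) Delete(key string) error {
        delete(m.objects, key)
        return nil
}

Passing newMemStorage() to GeneratePDFReport in a test exercises the real report code against entirely predictable data.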

Interface nil and Type Assertions

For all variables of interface types the runtime keeps track of two things: the underlying value and that value's type. This leads to two different ways an interface variable can be nil. First, the interface value itself can be nil. In this case there's no type information, no underlying value; nothing to talk about. This is very common with the error interface. In the case of no error the interface variable itself is nil because there's no error to be communicated.

In the second case there's type information but the underlying value is nil. Comparing such a variable to nil returns false because the interface value exists and points to type information. Ideally in this case nil is useful for that type, as in the final example in Dave Cheney's zero post.
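
The classic way to hit the second case is returning a typed nil pointer through an error interface. A minimal illustration (MyError and mightFail are made-up names):

package main

import "fmt"

type MyError struct{ msg string }

func (e *MyError) Error() string { return e.msg }

func mightFail() *MyError {
        return nil // "no error", but as a typed nil pointer
}

func main() {
        var err error = mightFail() // interface holds (type *MyError, value nil)
        fmt.Println(err == nil)     // prints false: there is type information
}

The usual fix is to declare the return type as error rather than *MyError.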

If needed you can get at the underlying value inside an interface variable.

io.Writer is a commonly-used interface. It has only one method: Write([]byte) (int, error). If I have a variable out of type io.Writer the only operation I can perform on it is Write. What if I want to Close it? Ideally if you have Close as a requirement you should make Close part of the interface type of your variable (or use io.WriteCloser instead of io.Writer).

For the purposes of illustration you can do a type assertion. This asks that the runtime verify that the underlying thing in your variable is of a certain type:

if c, ok := out.(io.WriteCloser); ok {
        err := c.Close()
        // handle error
} else {
        // not an io.WriteCloser!
}

In the above example, if out happens to be an io.WriteCloser then ok will be true and c will be out as type io.WriteCloser. If out doesn't happen to be an io.WriteCloser, ok is false and c is the zero value for io.WriteCloser, which is nil.

Anonymous Struct Types

Given a preexisting struct type I can create a value of that with data in one statement:

f := Foo{
        A: "foo",
        B: 17,
}
  • I'm creating a variable f
  • It is of type Foo
  • It contains these values

In the above struct examples each of the types I defined has a name; this isn't always necessary, provided I'm creating the struct on the spot and assigning it somewhere.

ff := struct {
        A string
        B int
}{
        A: "foo",
        B: 17,
}
  • I'm creating a variable ff
  • It is of this type
  • It contains these values

Like the named struct types above I can do an assignment with an explicit type conversion. In fact, because the anonymous type isn't a defined type, plain assignment is allowed here too:

f = Foo(ff) // totally fine
f = ff // also fine: ff's type isn't a defined type

This sort of anonymous struct type is common in table-driven tests. It's also not uncommon in defining nested structs as mholt's JSON to Go converter does.

You'll also sometimes see this:

stringSet := map[string]struct{}{}

  • I'm creating a variable stringSet
  • It is of this type
  • It contains these values

The last part looks a little strange. It's a map with strings for keys but what are the values? The values are empty structs: they contain nothing and therefore take up no memory. What good is that? It's a map that only tracks the presence of keys, which functions as a logical set. The final set of curly braces defines the initial contents of the map; it's empty.
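
A quick sketch of the set in use (fmt assumed to be imported):

stringSet := map[string]struct{}{}

stringSet["apple"] = struct{}{} // add a member
stringSet["pear"] = struct{}{}

if _, ok := stringSet["apple"]; ok {
        fmt.Println("apple is in the set")
}

delete(stringSet, "pear") // remove a member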

Anonymous Interfaces

Just like you can have anonymous struct types, you can have anonymous interface types. The following are equivalent:

var foo io.Reader

var foo interface {
        Read([]byte) (int, error)
}

In either case I can assign anything with a Read([]byte) (int, error) method to foo.

We're near the end of our journey which brings us to the enigmatic interface{}:

foo := interface{}(nil)
var foo interface{} = nil
  • I'm defining a variable
  • I'm giving it the name foo
  • Anything with these methods qualifies as this type
  • The contents are explicitly zero
interface{} is an anonymous interface type. It has no requirements so any value is suitable. I can pass around a value of type interface{} but I can't do anything with it without using a type assertion or the reflect package.

In this way the empty interface is different from other interface types in that it doesn't specify required behavior. It sidesteps the type system and turns what could be compile-time errors into run-time errors. When writing code that uses the empty interface, use great care.
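
For illustration, about the only thing you can do with an interface{} value is ask what it really is, typically with a type assertion or a type switch (describe is a made-up function; fmt assumed to be imported):

func describe(v interface{}) string {
        switch t := v.(type) {
        case int:
                return fmt.Sprintf("an int: %d", t)
        case string:
                return fmt.Sprintf("a string: %q", t)
        default:
                return fmt.Sprintf("something else of type %T", t)
        }
}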


    Wednesday, March 6, 2019

    3D Printer Filament Cheap Vacuum Desiccation

    Lots of 3D printer filaments are hygroscopic; they absorb water from the atmosphere. This is problematic for filaments because during printing they are subject to a sudden, dramatic jump in temperature. This causes the absorbed water to explode, potentially making detrimental changes to the material properties of the filament. For filaments like PETG and Nylon this makes for a rougher surface and anomalies in the finished part.

    The standard mechanism for drying filament is to put it in the oven for a while. I started getting strange results with my PETG because I got lazy and left it out between prints. I wanted to fix the issue but wasn't keen on putting plastics in my oven. I wondered about other methods and silica gel packets came to mind. Those are the little packets that come with shoes and other things and say DO NOT EAT. I was skeptical that these could reverse absorption that had already taken place, at least not in a timely fashion. Also, I like to tinker and wanted to experiment with weird ideas.

    I like to sous vide and had recently gotten a food vacuum sealer with an attachment for sucking the air out of wine bottles. I could seal the filament spools in a vacuum bag with some desiccant but I felt like it would work better if I could get closer to a proper vacuum. I didn't want to invest in a high-grade vacuum chamber and pump for an experiment. I started looking around and found the FoodSaver Quick Marinator; a rigid container with a valve for the hose that connected the vacuum sealer to attachments.

    Here's my setup:

    • Off-brand food vacuum sealer
    • The FoodSaver Quick Marinator
    • The vacuum sealer attachment hose
    • Coffee Filters
    • Flower desiccant powder
    The flower desiccant powder is the same material as in the silica gel packets you can get, but it's very fine, providing a higher surface area, and it's trivial to renew.

    I put 5 (arbitrary) tablespoons of desiccant powder into a coffee filter, fold it up, and tape it closed so the powder can't easily escape.

    I put my filament spool into the marinator, drop my desiccant pack on top and put the lid on with the tabs on the same side.

    Then, suck the air out. I run the pump until it shuts off. Wait a few seconds, and repeat. I go through about three cycles of this. After the pump has shut off, ensure the pump isn't warm; you don't want to ruin your pump by overheating it.

    Finally I ensure the valve is closed before removing the hose.

    Surprisingly, this worked. My next round of PETG printing produced a nice, smooth finish. I've taken to storing the spools with one of my homemade desiccant packets in a small Space Saver bag and vacuuming the air out. It doesn't look like they sell just the small bags, and the medium and larger sizes are grossly too large for a filament spool.

    It's important to note that so far I've only done this desiccating process on Dremel filament spools which are 0.5kg. As you can see in the picture above these spools just fit in the quick marinator. Larger spools probably won't fit; loose filament can, depending on how much you want to stuff in there. When I get larger spools I'll probably have to come up with another mechanism.

    Thursday, January 10, 2019

    New 3D Printer: Dremel 3D45

    Background

    For a while I had a 3D printer. Specifically a Printrbot Simple Metal. When it worked, it was a lot of fun. I could start with an idea and turn it into a real physical object without the need for a big workshop, huge mess, or personal injury. When it worked, it was great! But the process of getting it to work and keeping it working took all the joy out of the prospect of desktop fabrication.

    Three years later I found myself thinking about getting another 3D printer, hoping that the reliability had improved. I started looking at what was on the market with an emphasis on reliability: I didn't want to tinker with the printer, I just wanted to turn designs into objects. Dremel seemed to have some reliable offerings though a little pricey. I pulled the trigger on the 3D45 and so far I'm very pleased.

    Disclaimer

    I should be getting compensated for this review. My printer came with a small flyer that said if I post a review of the printer on my blog I would receive two spools of PLA for free. The flyer didn't provide direction on what the review should say.

    Initial Impressions

    I pulled the printer out of the box and had it set up in about 20 minutes. I was able to start a job for a frog figure from the internal storage. It came out almost perfect with a couple of misaligned layers and some poor adhesion ("springing") at the top.

    Overall though I had good luck with the printer initially. I would sometimes have trouble with first layer adhesion, which is the most common printing problem. Trying different things to correct it I found that washing the build plate, performing bed leveling, and reapplying glue would almost always correct the issue.

    The Good, the Bad, and the Ugly

    Pros

    It just works. I've used an entire spool of ECO-ABS with an overwhelmingly positive success to failure ratio. I've printed in PETG and Nylon as well with no failures there. Nylon is notoriously annoying to print with and it worked on my first try. Note: ECO-ABS is a proprietary formulation of PLA.

    It looks nice. While I don't actually care how it looks, the fact that it's fully self-contained means that the cat, the dog, dust, Dorito crumbs, ejected shell casings, etc won't put my prints at risk.

    It has almost all the bells and whistles. As stated, it's fully enclosed to help keep heat inside and reduce warping. It has a heated bed to help with the same. The build plate is tempered glass in an easy to remove and handle frame. I'm really pleased with how easy it is to work with the build plate. It's got a touch screen that provides helpful information and allows you to perform basic operations and start/pause/cancel prints from onboard storage or a USB drive. It prints "ECO-ABS", PLA, PETG, and Nylon. It has an integrated web cam, runout sensor, yadda-yadda-yadda.

    The filament seems really good. The color and performance are consistent throughout the spool. I don't really have problems with first-layer or intra-layer adhesion. The Dremel filament spools have an embedded RFID tag that the printer reads and uses to automatically set the temperature for the extruder and bed.

    Cons

    It's expensive. The printer retails for $1800. I got mine for $1400 on Amazon black Friday. That's fine, as a reliable device that lasts years is worth investing in. The filament, however, is also expensive. Common, non-funky filaments run between $15-30 per kg. Dremel filament runs $30-40 per .5kg. This is tempered somewhat by the fact that between the quality of the printer and their filament my prints succeed more, so I waste less (save for my own design mistakes).

    It's something of a walled garden. There's not a lot of room to tinker with the printer. I don't want to tinker with the printer... I don't want to have to. However, should it be necessary it would be nice. I can't complain too much as I made a conscious decision to sacrifice hackability for reliability.

    The network features are weak. You can manage and monitor print jobs through their cloud service but you can't do so over the LAN. The frame rate for the video in their cloud service is like 0.1fps. It's difficult to actually understand what's happening.

    WTF?

    The cloud slicer is garbage. You can upload an STL to the cloud service and have it do the slicing for you. It was easy! I never got a successful print from the cloud slicer. Usually there would be a loss of inter-layer adhesion leaving me with a slinky vaguely in the shape of my print. Using the Dremel Cura desktop slicer works extremely well and when I upload the gcode it produces to the cloud service the results are great. Unfortunately they kind of tout the cloud service and using only the cloud service brought me nothing but failure. Luckily I have some printing experience and knew how to troubleshoot this.

    "Filament DRM". They strongly encourage you to print using only Dremel filament. It seems to be high quality and the RFID identification feature is definitely neat. However, they state that using non-Dremel filament will void your warranty. This seems insane to me. I get that they wouldn't want to support issues people have using janky filament. I would totally accept that if I use 3rd party filament and my house burns down they shouldn't be held liable. But to say that they are unwilling to support me for using less expensive filament leaves a bad taste.

    And the filament cost. I would feel a lot less annoyed about the quasi-restriction of using only Dremel filament if it weren't 2-4 times the cost of other filaments. It is good but the difference in cost is kind of crazy. To some extent they market this product line for education; schools can't afford this. Also, they should encourage other filament vendors to use the same RFID spec. It's a cool feature. Overall, they should make the filament significantly cheaper or penalize their customers less for using third-party filaments.

    I'm really disappointed that I can't really use the built-in camera decently. I get that they wouldn't want to stream 720p over the Internet to everyone; they aren't trying to be a video streaming service. There should be a way for me to watch it in full quality over my home network.

    Trouble Strikes!

    Eventually I had a serious issue that wash/level/glue wouldn't resolve. I wasn't getting a first layer. The extruder would travel around the surface of the bed and yell CLICK CLICK CLICK CLICK as though some gears were slipping, like something was stuck. Thinking I had a clog I looked in the manual for their clearing process. The troubleshooting matrix says that if you have a clog you should contact support. Simultaneously the manual has instructions for clearing a clog. I went through the clog clearing process and found that the extruder could push filament out fine. Attempting another print I got the same symptom. It seemed that the tip of the nozzle was flush with the bed and it was clogged because there was no place to extrude filament to.

    I called customer support but it was a Friday afternoon. Wonderfully, they have US-based support. Unfortunately for me they're in Central Time, I'm in Pacific Time and they were already closed. I used their email contact form and received an automated response saying I'd hear back from them in two business days. This being Friday I shouldn't hear back from them until Tuesday or Wednesday. On Monday I got a response with corrective instructions that fixed the issue in a few minutes. It sucks that I was "out of commission" for several days. It's awesome that I had access to people whose job it was to help me. Free advice on 3D printing issues varies far and wide depending on the nature of your problem and its remediation.

    Conclusions

    I think my money on this printer was well-spent. I've been really enjoying it. I feel like I got the reliability I was looking for. The filament is really expensive which makes me hesitant to experiment.

    Saturday, March 3, 2018

    Impossible Burger

    I just ate an Impossible Burger at Fatburger. If you hadn't told me what I was eating I would have told you I was eating a good but not great beef hamburger. The fact that it was entirely plant-based is impressive. It was savory, the texture was well within expectations for a beef hamburger.

    Kelly tells me it nutritionally matches beef so it's not necessarily healthier. There's still the reduced environmental impact over beef. I could even see versions that compromise: including perhaps 10% beef for the extra flavor.

    It was expensive though. Our two single-patty burgers with fries and drinks came to $31. I'd like this to become more popular so the price can go down. I would happily pay a dollar extra per patty for this as a substitution option at fast food places.

    I'm really impressed by this as a first product and look forward to where this is going.

    Friday, January 5, 2018

    Black Mirror - S4E1: Roko's Basilisk

    The first episode of Black Mirror season 4 is quite good. It brought to mind Roko's Basilisk. Go read about the silly things human brains get up to. I'll be here when you get back.

    Tuesday, January 2, 2018

    FPS: Wolfenstein II: The New Colossus - More BJ For Frau Engel

    Steam Sale, Wolfenstein II. Yes.


    I can reasonably say I've been playing Wolfenstein as long as there's been Wolfenstein. I even played the MS-DOS port of the original 8-bit game a few times. I played Spear of Destiny, Blake Stone, etc. Wolfenstein: The New Order is almost flawless.

    The New Colossus is a solidly decent game. There's more of the unexpectedly good story and character development from the previous game. This is definitely not the fungible, faceless protagonist of previous entries in the series, or even most FPSes in general. Your character has thoughts, emotions, demons, loves, etc. The supporting cast of characters is dynamic and lifelike and it's definitely worth getting through the game just to see how the events play out. The ending actually felt fulfilling; more so than in The New Order in my opinion.

    About halfway through though the story jumps the shark in spectacular fashion. In the first half the protagonist is put at a significant disadvantage and you're made to wonder how it might be overcome. The way in which it is overcome is completely absurd and thankfully it is quickly put behind you. When you get to the shark jumping, groan your way through it and press on.

    Combat is mostly satisfying. You get the same optional dual-wield mechanic as in The New Order, though in this instance it's rare that not dual-wielding is the way to go. The way you choose which weapons are in which hands is unnecessarily clumsy on PC; I suspect it's made for a console controller. As you explore you can find hidden weapon upgrade kits that allow you to choose from three upgrades per weapon that are not mutually exclusive. Each weapon has an upgrade that can be toggled. For example, the machinepistol upgrade makes your rounds incendiary while dramatically reducing the rate of fire. The incendiary MP is magical against the panzerhunden but much less effective against human targets. I found a triple-shot shotgun in the left hand and a single-shot assault rifle in the right to be a great combination, allowing effective firepower against both close and far targets.

    In principle The New Colossus allows you to choose stealth over bravado the same way The New Order did. In practice you can stealthily take out the first 2-6 enemies in a stage before things go sideways, more or less regardless of your execution. If a guard finds the body of a fallen enemy they go into a heightened state, which is fantastic for gameplay. Unfortunately, you have no mechanism for moving or hiding those bodies, even given ample time. This can be quite frustrating. Doom (2016) also gave you no reliable means for stealth combat, or even space to conduct ambushes but in Doom the whole flow of the game is in high-velocity, run and gun combat. The New Colossus holds stealth combat out and then snatches it away, just when you think you've got it.

    Run and gun works well and as mentioned you're mostly forced down this path. That said, most enemies have no special tactical weaknesses (less armor on the back) so few fights benefit from flanking. The super soldiers are weak in the back but it's easy to bait them into charging past you for easy back shots. Overall most fights I ended up just finding the most advantageous ambush point and waiting for the enemies to come to me. And they did... I would really like an FPS developer to add an "I see the bodies of my comrades piled in this doorway so maybe I should take a different route" mechanic to enemy AI. Which is a shame because many of the combat spaces are huge with multiple tiers, multiple places for cover, resupply, etc. Usually though I couldn't be bothered and just explored the space at my leisure after dispatching the majority of my foes with the stupidest possible ambush. Eventually though I decided to not pick up heavy weapons, to not ambush as much, just to keep the combat more dynamic, even on a higher difficulty setting.

    After you've hit the game's shark-hurdling portion you get to choose a "contraption" that gives you a special ability. As the game progresses you get to pick up the other two. The difference between these choices basically boils down to, "to get into the next room do I crawl under, climb over, or smash through?". Regardless of your choice you still end up in the same room so the choice is somewhat illusory. It's mostly a question of what kind of entrance do you want to make to the party.

    Criticisms aside, it's quite a fun game and worth a playthrough. Grab it on sale and rock out!

    Monday, January 1, 2018

    Cards Against Humanity Enhancements

    One of my friend groups plays a fair amount of Cards Against Humanity. I've come up with some enhancements.
    1. Use a shoebox, game box top, or some other box for white card discard. It's much neater than discard piles.
    2. Use a hat or something similar to collect white card submissions. It genuinely speeds things up.
    3. Provide multiple stacks of white and black cards respectively for people to pull from. It makes it much easier for people to get their next cards and keeps the game moving.
    4. If the black card with three blanks is in your deck, pull it out permanently.
    5. When a black card with two blanks comes up, the judge draws a white card to fill in one of the blanks of their choosing. Everyone submitting two white cards slows the game down. Filling in one of the blanks with a random white card means the black cards with two blanks become unique every time they appear, rather than tedious.
    6. During the initial deal-in for each player, give them three extra white cards. Each player discards three before the game begins. This provides a funnier set of initial cards for everyone.
    7. Draw two, keep one. After each player submits their white card they must draw a replacement white card. Instead, each player should draw two white cards and discard one of the two they just drew, their choice. This reduces the accumulation of bad white cards.

    Monday, September 4, 2017

    Fighting Acute Depression

    My only reliable strategy for fighting acute depression:
    • Identify something simple (simple does not equal easy) that I don't want to do but will feel good having made progress on. Dishes is a good one for me. Laundry is another.
    • Identify the smallest thing that constitutes progress. Washing one dish, putting away one piece of clothing.
    • Focus on the fact that doing that smallest thing represents progress. If I get more done, great! If not, I made progress when feeling overwhelmed.
    • Go do that smallest thing. Chances are more will get done since starting is the hardest part. If not, I don't beat myself up. I made progress and the next progress will be easier.
    • It doesn't always work and that's okay too. When getting started is the hardest part, trying is a small amount of success which fuels further success.

    Friday, January 27, 2017

    Vault: Error checking seal status: ... Forbidden

    $ vault status
    Error checking seal status: Get https://vault.internal-domain:8200/v1/sys/seal-status: Forbidden
    There was this issue suggesting it was a problem with the storage backend. In my case it was having HTTP_PROXY set in the environment and the proxy wouldn't allow the connection. Running unset HTTP_PROXY fixed the issue.

    Thursday, January 5, 2017

    Golang Install: Unapproved caller. SecurityAgent may only be invoked by Apple software.

    Trying to install golang 1.7.4 on Mac OS X I got the following error:

    Unapproved caller.

    SecurityAgent may only be invoked by Apple software.
    Reboot.

    Friday, June 17, 2016

    Acceptance

    Since Dayna passed away I've felt strange in that I haven't felt guilty. I took responsibility for her care and have maintained that for years so ostensibly I bear some responsibility for her passing. I've worried that my lack of guilt indicated that there was something wrong with me.

    I've begun to accept that I did everything I could. Everything I could was limited by my capacity. I was also in a very difficult situation and my failures and missed opportunities were expressions of that. She was the only one who could save her but she was stuck, trapped by her illness and the habits it had instilled in her.

    I made mistakes but I never intentionally acted against her. Were there a solution available to me I would have acted on it. No one can know if there was a solution available to her. That weight was only hers to carry and she couldn't bear it. Many of us reached out to her because we felt responsible for helping her but we could do nothing to fix her. Sometimes that's just how things work out.

    Monday, April 4, 2016

    Dayna Gordetsky

    On Friday the 1st of April I learned that Dayna had passed away, having taken her own life. She had suffered a great deal both physically and emotionally. I don't think any of us will ever really understand what she was going through but we're coming together as a family to work on accepting it. If you're in the San Diego area and would like to attend the funeral, contact me privately for information.

    Saturday, February 13, 2016

    How I Became a Monster, Part I

    I don't really know how it happened and that's part of the problem. It's been like boiling a frog; the changes have been so slow it was hard to see the change happening. I'm writing this via stream-of-consciousness so it's not going to flow well.

    Dayna and I have been friends since 2000, and partners since 2002. I'll skip the love story and get to the plot. I've been abusive, neglectful, unsupportive, unsympathetic, and deceptive to her.

    I'm compelled to clarify abusive: I've never been directly physically abusive toward her. I've never struck her or anything like that. It would have easily been the last thing I ever did, plus it's just not my way. My sister and I were disciplined with spanking as children up until the point where my father left a bruise on my sister. He never spanked us again and after that he was able to bluff his way through spanking situations. That's how I remember it, anyway. I've never felt physically abused and actually respect my parents' approach to spanking.

    I've been an angry, anxious, depressed person for a long time. I took a lot of that out on Dayna. The rest I habitually keep inside until it boils over to take out on Dayna. It's taken me years to recognize the smallest part of this and I know I still don't see all of it. It's hard to know where to start as this has been ongoing for over a decade as part of everyday life.

    The first thing to know is that I have a grossly overdeveloped sense of responsibility. I cannot overstate this enough. If I can take care of something I will probably take responsibility for it. I can't possibly do everything so I fail to deliver on most things. Luckily for me this failure is silent. Few if any people know I've taken responsibility for this or that so I can feel inwardly guilty and unconfident but look capable and hard-working and dependable on the outside. I exist in a continuous state of being overwhelmed by my (notional) responsibilities and undermined by my (notional) failures. Saying it out loud hasn't done much to rein this in.

    When Dayna and I really started getting to know each other she told me about a number of mental disorders she suffered from. Growing up I saw my mother as constantly under great stress from her work and my father doing everything he could to make things easier for her. Being a kid I could have been way off but I internalized what I thought I saw. So I took responsibility for making her life better. Had this been something I said out loud I'm sure Dayna would have said this was ludicrous. One person can't fix another, they can just be accepting and supportive. I was pretty accepting and supportive in the beginning but eventually this was overridden by my own shortcomings and misguided ideas on how people help each other.

    Things have gotten particularly bad between us over the last few months. Long-standing injuries to Dayna's spine have been getting worse, putting her consistently in a great deal of pain. The pain has made her irritable and my inability to really help her has been triggering my need to fix things. I haven't been fixing her so my mind has been bouncing between "I'm failing" and "There's nothing I can do". I've been physically supportive by way of getting her things, trying to make her comfortable, etc, but I've been almost completely emotionally detached and unwilling to accept that it's not for me to fix. This detachment made me completely emotionally unsupportive. My lack of sympathy and support causes her stress and anxiety which increases her pain and irritability which in turn causes me to be stressed, anxious, angry, and unsupportive. I've been there for her in ways that are completely superficial and rarely in ways that were genuinely meaningful.

    To be continued...

    Friday, August 21, 2015

    Phone Interv(iew|ention)

    I participate in the hiring process at work through phone and onsite interviews. The phone interviews are initial screenings for candidates intended to determine who we want to bring in for the onsite interview gauntlet. Our process has me doing Q&A with the candidate then I write up details of the interview and my recommendations. Then, one or more people higher up the chain make decisions about how to proceed.

    For both types of interviews there's a fixed time slot. Sometimes during a phone interview I'm confident within the first 10-20 minutes that the candidate isn't qualified. Often in these situations I like the candidate and would happily hang out with them over drinks and geekery but I wouldn't want my project to depend on them.

    I usually come to the conclusion that the candidate won't fit after I've given them a few extra meters of rope to climb up with and they've just hung themselves more. If it seems unlikely that more rope is going to rescue them my standard bailout is "Those were all the questions I had for today. Do you have any questions for me?" It's polite but dodgy and it seems unlikely to me that the candidate doesn't realize that this is a signal that they've bombed.

    On at least one occasion the "questions for me" portion developed into a friendly conversation about skill development and I was able to provide a book recommendation that flowed naturally with that conversation. What I'd really like to do is just cut the interview, be forthcoming, say "the higher-ups may disagree but I think you still need to develop for a position like this", and then provide guidance about what the candidate can learn and practice to up their game.

    I see a few potential risks here. First, it's sufficiently out of normal (impersonal) interview protocol as to feel vaguely unprofessional. That said, is it really important to stay 100% professional (impersonal) when I can actually help someone? It's not just me though, I'm also representing my company.

    Second, I'm essentially cutting out the higher ups here and decisions about a candidate's progress through the hiring pipeline are under their purview. But, they make those decisions based on my view of the candidate so if I'm confident the candidate isn't a good fit it would be exceedingly unlikely for them to disregard that.

    Third, there's the risk that I might come across as providing a prescription for getting the job in the future. I can mitigate that to some extent by being as forthcoming about my intent as possible. Still, there will always be a candidate who gets the wrong idea despite my best efforts. If I've done my best I guess I can just let it be "their problem" but that feels kind of irresponsible.

    The easiest thing to do is just get out of the call and forget about them and that's mostly what I've been doing so far. I was asked recently by a candidate if I share my knowledge publicly and it surprised me to say "no". I used to but I haven't been recently and I feel that's kind of a loss. I should be teaching more.