Blog

  • Remove ^M characters and more with repl.bash

    Hey folks, this is a quicky but a goody.

    First off, respect the character encoding of a file. I don’t know how many devs out there violate this rule, but if you’re like me and Joel On Software, you’ll agree that you should respect the character encoding of a file.

    If you happen to see that your file has picked up code page 1252 (aka Windows Latin-1) bytes, then you’ll have a variety of random characters like ^M or ?~@~Y or ?~@~\ or ?~@~] .

    Well, I wrote a script that removes these guys and makes sure the file ends up in plain Unix format. Here it is:

    #!/bin/bash
    #
    # By: barce[a t]codebelay.com
    # -----------------
    # this script replaces microsoft special chars with plain ol' ascii
    #
    # usage: ./repl.bash filename
    #

    # replace ^M characters (CRLF -> LF)
    perl -pi -e 's/\x{0D}\x{0A}/\x{0A}/g' "$1"

    # replace garbage with single quotes
    # ?~@~Y
    perl -pi -e 's/\x{E2}\x{80}\x{99}/\x{27}/g' "$1"
    # orphaned two-byte remnants of the sequences above
    perl -pi -e 's/\x{80}\x{99}/\x{27}/g' "$1"
    perl -pi -e 's/\x{80}\x{9c}/\x{27}/g' "$1"
    perl -pi -e 's/\x{80}\x{9d}/\x{27}/g' "$1"

    # replace garbage bullet with asterisk
    # ?~@?
    # e2 80 a2
    perl -pi -e 's/\x{E2}\x{80}\x{A2}/\x{2A}/g' "$1"

    # replace garbage quotes with plain quotes
    # start: ?~@~\
    # close: ?~@~]
    # e2 80 9c / e2 80 9d
    perl -pi -e 's/\x{E2}\x{80}\x{9C}/\x{22}/g' "$1"
    perl -pi -e 's/\x{E2}\x{80}\x{9D}/\x{22}/g' "$1"

    # replace garbage hyphens (en dashes) with plain hyphens
    perl -pi -e 's/\x{E2}\x{80}\x{93}/\x{2D}/g' "$1"

    # replace garbage ellipsis with three dots
    perl -pi -e 's/\x{E2}\x{80}\x{A6}/\x{2E}\x{2E}\x{2E}/g' "$1"

  • Getting Around the Politics of Subversion with git

    This is the nightmare scenario. You are working with a coder who overwrites your changes in subversion. You’ve told this coder once, twice, three times, “Hey, please don’t do that. Hey, let’s talk about your changes before you commit them.”

    But this coder for some reason thinks that he or she is the gift of the gods when it comes to coding, and continues to overwrite your changes.

    This is where git comes in. If I had learned sooner about this feature of git and the idea of accepting or rejecting changes, I would have avoided the whole nightmare of re-committing code and lengthy merge debates.

    Most projects you work on won’t involve the worst case above. Most of the time, there will be a sensible development rule that says never commit bugs into subversion. But whenever you have to re-factor code and commit each change, branching and then later merging can be an issue in subversion, and it’s slow too.

    On a project that I’m working on now the client wants only good code in svn which is great, and so I’m using git with svn. I got this idea thanks to Jakob Heuser. Thanks, Jakob!!!!

    Here’s a quick cheat sheet; it assumes you are using GitHub:

    mkdir newlispoauth
    cd newlispoauth/
    git init
    touch README
    git add README
    git commit -m 'first commit'
    git remote add origin git@github.com:barce/newlispoauth.git
    git push origin master

    Now we have to pull in changes from subversion:

    mate .git/config

    In the config file add something that looks like this:

    [svn-remote "newlispoauth/trunk"]
      url = http://codebelay.com/newlispoauth/trunk
      fetch = :refs/remotes/newlispoauth/trunk

    Now we’re gonna fetch the subversion repo:

    git-svn fetch newlispoauth/trunk
    git checkout -b local-svn/trunk newlispoauth/trunk
    git svn rebase
    git checkout master
    git merge local-svn/trunk
    git mergetool # if there are conflicts with the README file above
    git add README # if you had to make changes with the mergetool
    git commit
    git push origin master

    Now you are working with “master” and “local-svn/trunk”.

    “master” is for the changes you share with your team on GitHub.
    “local-svn/trunk” is yours, and is where you push changes to subversion.

    You basically pull in changes from newlispoauth/trunk and do your work in local-svn/trunk.

    Let’s put the changes in master into “newlispoauth/trunk” and commit those changes to subversion:

    git checkout local-svn/trunk # you did commit your changes in origin right?
    git merge master
    git-svn dcommit
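    The day-to-day loop after setup is: rebase “local-svn/trunk” against subversion, merge it into “master”, and push. Here’s a self-contained sketch of just the branch-and-merge portion using plain git (git-svn needs a live subversion repo, so a local branch stands in for the svn-tracking branch; all paths and names are illustrative):

```shell
set -e
# Throwaway demo repo in /tmp.
rm -rf /tmp/gitdemo && mkdir -p /tmp/gitdemo && cd /tmp/gitdemo
git init -q
git config user.email you@example.com
git config user.name you
echo hello > README
git add README
git commit -qm 'first commit'          # on the main branch ("master" above)
git checkout -qb local-svn/trunk       # branch that tracks subversion
echo 'change from svn' >> README
git commit -qam 'pulled from svn'
git checkout -q -                      # back to the main branch
git merge -q local-svn/trunk           # fold the svn changes in
cat README                             # now has both lines
```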
  • The Clean Slate

    One of the great things about America is that you can start all over again by moving to a new town, or by simply doing the thing you are afraid to do.

    I had the offer letter in my hand. It was for a profitable company that had just secured enough VC to outlast the Great Depression, The Sequel. Experience told me that this was the most sensible thing to do, so I signed the offer letter. I would be making more money than at any previous job, and the position was a cushy middleware coding gig.

    But something nagged at me. So much of life is an illusion. For some reason, I felt that the secure, money-maker of a job was an illusion. I also felt that I was taking myself away from the game of business where I would right all the wrongs done to me. I am still aching freshly from some wounds that people gave me; people who dishonored me by claiming I was a coder of poor quality when just weeks before they were saying I was the best of coders. I worked weekends for these people when none of the other developers would. I volunteered the most for being on-call, and they dishonor me.

    Anyway, dear Readers, you in the industry know who these dishonorable people are, and because I decided to take the harder road of freedom, I am free to speak of them here. Why? Because in a world where faint praise is damning, there is no amicability in that. Because I will never, ever use the dishonorable as a reference. I will only use honorable references from honorable men, if I have to, but I really just want to free myself from references altogether. I will hack things out project by project and by the skin of my teeth, and be free.

    I turned the offer of security down.

    I chose total personal freedom.

    The crumb of freedom tastes better than the banquet of slaves.

    I have been learning so much in the situations that I’ve been in these past few days.

    My code is my reference.

    What would you choose? Security or Freedom?

  • How the FBI Would Have Tracked Palin’s Hacker If He Were L33ter

    It’s been a few weeks since Palin’s “hacker,” David Kernell, got caught because he left a reference to ctunnel.com in the screenshots of Palin’s email.

    Enjoy Jail, Punk!

    What if David Kernell was able to remove the references to ctunnel.com? What would the FBI have to do to catch him? And how would a would-be hacker avoid detection?

    1. The FBI would have to obtain records from Yahoo and 4chan, and these records would hopefully reveal the IP address(es) that accessed Palin’s account.
    2. The FBI would also have to search data retrieved from a descendant of Carnivore, wiretapping software used for the Internet c. 2001. Such data could reveal the MAC address of the hacker’s machine. The MAC address would lead to the place of purchase of David’s network card.

    Even if David Kernell had photoshopped ctunnel.com out of the screenshots of Palin’s email, the FBI could still have caught him in two ways:

    1. The IP address at Yahoo or through Carnivore-like software would have led the FBI to ctunnel and then to David’s IP address.
    2. The MAC address gotten through Carnivore-like software at David’s ISP (which is not really likely) would have led the FBI to the store at which David’s computer was purchased. Something like “ping davids_IP && arp -a” would have to be run at the LAN level.

    So how else could David have avoided detection?

    1) He could have chained proxy servers.
    2) He could have used a combination of p2p networks like the ones used for downloading movies and music to get to the web pages.

    But even then, the FBI would still be able to catch him.

    The FBI could still log name-server lookups, the very technology that allows your computer to see www.fbi.gov as 64.212.100.43. If a log of name-server lookups matched the time stamps of when the hacked pages were accessed, then the FBI would have a strong reason to believe that the hacker was using the ISP that provided the name-server lookup, and from there get to David.

    Okay, okay. Let’s say that David disabled name server lookups. Could the FBI catch him if he went as far as that?

    If somehow his MAC address got leaked, it would lead right to whoever purchased his computer’s network card. If he paid cash for his network card on the black market, or Craigslist, then the FBI would be on a wild goose chase.

    I think if he took all the precautions above, the FBI would be at a total loss for tracking Palin’s Hacker if he were l33ter.

    Thoughts?

  • Reflections on the Last Recession: 2002-2005

    I remember the last recession clearly. After living in Italy, I arrived back home to cheap rents and a San Francisco that felt very empty. I found a huge room in Cole Valley with a hot tub and garden for $600 a month. I thought that was expensive at the time, in the Winter of 2002.

    Here’s the run down:

    • 2002 – I spent this year re-adjusting to being back in San Francisco. I felt alienated. When I arrived in Italy, the first person to greet me was a beautiful woman who said, “Good Evening, can I help you? Are you lost?” When I came back to the good ol’ US of A, and greeted the first person I saw, her response was a rude, “Who are you? Do I know you?”
    • 2003 – I cashed out a good chunk of my savings. I spent a year learning ancient Greek and studying philosophy. What I got out of that study was how beautiful philosophy is, and how sufficient unto itself, and at the same time how indifferent the world is to such beauty. I really wanted to be some gal’s boyfriend at this time because I could have used a good portion of my free time to help her out. I hit a low point employment-wise when I was stacking candles at Planet Weavers. When the money ran out, I barely had enough to eat. However, thanks to a few good people, I started getting odds-and-ends tech jobs. The best thing is that I found out my natural work and sleep schedule: my body likes to start work at 9am and sleep at 1am.
    • 2004 – I started to see more and more tech jobs. I began to see the blossoming of a social life I never thought possible. I went out on a lot of dates this year, and I’m still friends with the women I’ve dated from this time.
    • 2005 – I really feel that this year was the worst and the best. I really believed in the whole Web 2.0 thing, but I found that caused a lot of conflict with some of the folks I was working with. I learned that if you’re gonna go cutting edge, it’s gonna shake your world. I lost a lot of old friends because of Web 2.0 ambitions, but I gained a few new ones. It was worth it.

    Although I had the chance to take a full-time job in 2003, I stayed lean and focused on learning Web 2.0 technology until I could work for the Web 2.0 site that I wanted to in 2006.

    It’s strange looking back on those times. I know I’ve changed a lot, but *some* of the very people who helped me have not. I have tried to help them in whatever way I can, but they all just seem stuck. Then there are others who have helped me, whom I have helped in turn, and they’re not stuck. They’re thriving. Will they continue to thrive?

    Takeaways: use an economic downturn as an opportunity to find out who you really are. I’ve worked with so many people who were just in it for the money and not real geeks at heart. Steer clear of these people; they’ve turned toxic from getting involved with the wrong technologies for the wrong reasons.

  • Darwin Ports when OS X Idiosyncrasies Get You Down

    I started using Darwin Ports when tools like hping3 and ctags didn’t work quite right on OS X. For example, the Berkeley Packet Filter on OS X is totally different from the one on Linux, so hping3 wouldn’t compile correctly. Also, ctags on OS X doesn’t have the -R flag for recursion.

    With Darwin Ports, I can install libraries and executables that don’t work quite right on OS X but work great on Linux.

  • Sending Files with hping3

    This is a quick cheat sheet on how to use hping3 to send a text file. Thanks Gr@ve Rose for inspiring this.

    The target machine should be listening like so:

    hping3 192.168.0.108 --listen signature --safe --icmp

    The source machine should be set up like so:
    hping3 192.168.0.108 --icmp -d 100 -c 2 --sign signature --file ./test.txt

    -d specifies the data size
    -c specifies the number of pings to send. We just need 2 pings to send the test file below.

    test.txt just contains lolspeak:
    ---- start ----
    oh hai
    we bustin pass dey bad fire wall
    yay!
    ---- end ----

    I haven’t tested this out with binary files, but I’m pretty optimistic that a uuencoded file would get through, and could be re-assembled at the target server. Also, hping3 can be used to turn on a network service like sshd if it receives the correct “signature”.
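    For binary files, the encode-to-text step speculated above can be sketched with base64 (uuencode would work the same way; the hping3 invocations are unchanged, and the filenames are illustrative):

```shell
# Sender side: encode the binary to plain text before handing it to hping3.
printf '\001\002binary\377' > payload.bin
base64 payload.bin > payload.txt    # this is what you'd pass to --file

# Receiver side: decode what the listener captured.
base64 -d payload.txt > roundtrip.bin
cmp payload.bin roundtrip.bin && echo 'files match'
```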

  • Who Is An Inspiring CEO?

    Who is an inspiring CEO?

    For my next job, I want to work for an inspiring CEO. Not someone who gets whispered about in the industry as being “weird” or a “control freak.” Not someone who puts doubt into people. I hate getting asked questions that sound more like accusations: “You work(ed) for him/her?” I’ve always defended my CEOs’ records.

    Here’s what I mean by inspiring:
    1) Says the right things that motivate people in the company.
    2) Recognizes when people would gladly sacrifice for the company and acknowledges it. I’m talking about an honor based company.
    3) Welcomes convivial debate.
    4) Leads by doing.
    5) Would probably find a solution to the Kobayashi Maru test.

    If you know of an inspiring CEO, I would love to hear about it.

  • How To Save Money with Memcache

    Jeremiah Owyang twittered, “I’m asking every CEO I meet about their take on the economic downturn.” This is great advice, but I’ve got my own reasons for thinking so, and would like to invite Jeremiah to share his reasons here.

    For me, I’ve been talking to different CEOs and telling them the importance of scalable technologies like memcache to save money NOW. I ask CEOs about the economy to hear their concerns and to see for my own business reasons if they’ve implemented memcache on their servers.

    The math is really simple. Most social network sites can waste anywhere from 5000 to 10000 seconds per week on slow, un-cached database queries. Assuming the ability to handle around 30 requests per second, many large websites miss out on an additional 150000 to 300000 page views per week. Not to mention that you lose a user for every 30 seconds of wait.
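    That arithmetic as a quick sketch (the inputs are the post’s estimates; the post rounds the user numbers to 165 and 335):

```shell
# Back-of-envelope math from the paragraph above.
secs_low=5000; secs_high=10000     # seconds/week wasted on uncached queries
rps=30                             # requests served per second
echo "$(( secs_low * rps )) to $(( secs_high * rps )) extra page views per week"
# one user lost per 30 seconds of wait:
echo "$(( secs_low / 30 )) to $(( secs_high / 30 )) users kept per week"
```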

    If a busy site with stats above implemented memcache they would have:

    • not lost between 165 and 335 users per week.
    • 150000 to 300000 more page views per week
    • $500 to $1000 more per week on a 100 x 720 skyscraper with the right CTR and eCPM (that’s an additional coder)

    If you want to see the above savings, feel free to contact me at barce[a t no spam]codebelay.com .