Fail of the day #2: SSD drives out of spec

A while back I was handed an SSD drive with the question of whether it could be repaired. At first glance it looked fine. But wait… the PCB is not level. In fact, it is seriously wobbly. What on earth happened to this SSD??

That PCB should really not have that shape.

Looking closer at the flash ICs, it turns out several of them have pins that have disconnected from the PCB. Ok – that’s tough but nothing some careful soldering cannot fix.

But after inspecting the rest of the PCB it is clear that some inductors and capacitors have been torn off too. Wow – this drive took a real beating – nothing here to be salvaged except maybe a crystal or inductor. Could possibly be useful in other projects. Well – it goes into the to-be-used-in-future-projects box.

That’s only half the story… Quite a while back I came across another pair of PCBs. Another SSD, in fact – broken in two. Probably on purpose to prevent data extraction from the drive, but it gives a good opportunity to have a closer look at what is inside an Intel SSD drive.

No rescue possible here, no matter how good your soldering skills are.

All the passive components (capacitors, resistors) are so small they are impossible to hand solder – no point in salvaging them. Most of the other components are held in place with epoxy, making removal impossible (but I will for sure buy Intel SSDs from now on – these things are built to last!).

The PCB seems to have multiple layers. There is for sure at least a ground plane in there, probably 1-2 signal layers too (hard to count them without a microscope).

The one component that might be of interest in other projects (adding memory to OpenWRT based routers comes to mind) is the SDRAM, a Samsung K4S281632I-UC60 organized as 8M x 16. On the other hand – hand soldering that one will be… difficult (understatement) and will require rebuilding the OpenWRT kernel. Hmm… I will probably just recycle it.

FAIL of the day #1: Repair of NiMH charger


When buying some GP branded NiMH rechargeable batteries about a year ago, a “GP PowerBank Travel” charger (model GPPB03GS) was included as a promotion. It’s a nice little charger that runs both off 220 V and 12 V (for use in car, I presume). It can charge 1-4 AA batteries, or 1-2 AAA batteries, with additional trickle charging after full capacity has been reached.

The charger worked well for some months, but one day after charging some batteries overnight the LED blinked red, and when removing the batteries it was clear something had gone wrong. See the melted plastic? Not good.

As this was very much an el-cheapo charger, one should probably not expect much from it. But I was still curious about how the 220V was brought down to more useful voltage levels, and if the internals would live up to safety standards. I was actually quite surprised at how complex and well designed the internals were:

Looking closer, there are three distinct parts of the PCB:

– 220V section, which is shielded with plastic blast shields towards the rest of the electronics – nice! This is a classic Switched Mode Power Supply (=SMPS), with an optocoupler feedback loop. Looks like an SMD-type TL431 voltage reference – very common in SMPS designs.

– 12V section, with some protection diodes, filter caps etc – but no other major components.

– Charger circuit, using an unknown controller IC. The markings have been shaved off. I just don’t understand why they go through the trouble of doing that… It’s not like this is some super classified product where the design should be kept secret at any cost.

A closer look at the components and PCB around where the plastic had melted does not give any clues as to what went wrong – in fact nothing visible anywhere on the PCB indicates a catastrophic failure of the charger.

Bringing out the multimeter and measuring the output from both SMPS and 12V section shows that those voltages are all good – most likely the problem is instead in the unknown charging controller IC, or possibly some of the tiny SMD FETs, diodes etc that complement the charging IC.

So… given that this kind of charger costs next to nothing these days, I’ll leave it for dead for now. The SMPS works as it should, so maybe that part can be reused in some other project – I’ll stash it in the “possible-future-use” parts bin.

Using CoffeeScript to create QlikView extensions

I am not a Javascript fan. To be honest, I find it rather abusive. There are probably lots of theoretically sound reasons for designing the Javascript (=JS) syntax the way it is – but that does not make it more readable or easy to learn.

I am however a QlikView (=QV) fan. Or rather: QV is a very powerful data visualisation, exploration and discovery tool, but its main drawback is its somewhat dated visualisation options. In a world used to fancy angular.js based dynamic web sites and great looking HighCharts graphs, QV’s visualisation options aren’t quite up there.

QV however has a rather interesting “extension” mechanism. You can create object extensions that add new visualisations to QV, using JS to develop the extensions. Document extensions are used to modify the deeper levels of QV applications – very useful and cool (someone created a document extension using the accelerometers in iPhones to enable new ways to interact with QV applications – pretty cool!) but not the focus of this post.

So, we want to create QlikView object extensions. JS is the mandated language. Ouch. We can however use CoffeeScript to remove the bad parts of JS, making the code base smaller and easier to read and maintain. CoffeeScript is very cool, with lots of testimonies to its greatness out there (Dropbox is using CoffeeScript these days, and has shared some experiences).

Note: You need to install node.js before CoffeeScript. I’ve tried this on both Windows 8.1 and OS X – compiling CoffeeScript works without problems on both platforms.
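For reference, installing the CoffeeScript compiler itself is a one-liner once node.js is in place (a minimal sketch – the npm package was called coffee-script at the time of writing):

[code language="bash"]

# install the CoffeeScript compiler globally (may need sudo on OS X)
npm install -g coffee-script

# verify that the compiler is on the PATH
coffee --version

[/code]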

Turns out this works quite well. Brian Munz of QlikTech has created a couple of very nice templates to make it easier to create extensions. I’ve taken the liberty of converting one of them to CoffeeScript, to show how easy it is to convert JS to CoffeeScript (and to make my own future extension development easier).

The CoffeeScript code can also be found in my repo at GitHub.

The Javascript version first:

[code language=”text”]

var template_path = Qva.Remote + "?public=only&name=Extensions/template_simple/";

function extension_Init()
{
    // Use QlikView's method of loading other files needed by an extension. These files should be added to your extension .zip file (.qar)
    if (typeof jQuery == 'undefined') {
        Qva.LoadScript(template_path + 'jquery.js', extension_Done);
    }
    else {
        extension_Done();
    }
}

function extension_Done(){
    //Add extension
    Qva.AddExtension('template_simple', function(){
        //Load a CSS style sheet
        Qva.LoadCSS(template_path + "style.css");
        var _this = this;
        //add a unique name to the extension in order to prevent conflicts with other extensions.
        //basically, take the object ID and add it to a DIV
        var divName = _this.Layout.ObjectId.replace("\\", "_");
        if(_this.Element.children.length == 0) { //if this div doesn't already exist, create a unique div with the divName
            var ui = document.createElement("div");
            ui.setAttribute("id", divName);
            _this.Element.appendChild(ui);
        } else {
            //if it does exist, empty the div so we can fill it again
            $("#" + divName).empty();
        }

        //create a variable to put the html into
        var html = "";
        //set a variable to the dataset to make things easier
        var td = _this.Data;
        //loop through the data set and add the values to the html variable
        for(var rowIx = 0; rowIx < td.Rows.length; rowIx++) {
            //set the current row to a variable
            var row = td.Rows[rowIx];
            //get the value of the first item in the dataset row
            var val1 = row[0].text;
            //get the value of the second item in the dataset row
            var m = row[1].text;
            //add those values to the html variable
            html += "value 1: " + val1 + " expression value: " + m + "<br />";
        }
        //insert the html from the html variable into the extension.
        $("#" + divName).html(html);
    });
}

//Initiate extension
extension_Init();
[/code]

Now the CoffeeScript version. A lot more readable, at least to me:

[code language=”text”]
template_path = Qva.Remote + "?public=only&name=Extensions/template_simple_coffeescript/"

extension_Init = ->
  # Use QlikView's method of loading other files needed by an extension. These files should be added to your extension .zip file (.qar)
  if typeof jQuery == 'undefined'
    Qva.LoadScript(template_path + 'jquery.js', extension_Done)
  else
    extension_Done()

extension_Done = ->
  # Add extension
  Qva.AddExtension('template_simple_coffeescript', ->
    _this = this

    # add a unique name to the extension in order to prevent conflicts with other extensions.
    # basically, take the object ID and add it to a DIV
    divName = _this.Layout.ObjectId.replace("\\", "_")
    if _this.Element.children.length == 0
      # if this div doesn't already exist, create a unique div with the divName
      ui = document.createElement("div")
      ui.setAttribute("id", divName)
      _this.Element.appendChild(ui)
    else
      # if it does exist, empty the div so we can fill it again
      $("#" + divName).empty()

    # create a variable to put the html into
    html = ""

    # set a variable to the dataset to make things easier
    td = _this.Data

    # loop through the data set and add the values to the html variable
    for rowIx in [0..(td.Rows.length-1)]

      # set the current row to a variable
      row = td.Rows[rowIx]

      # get the value of the first item in the dataset row
      val1 = row[0].text

      # get the value of the second item in the dataset row
      m = row[1].text

      # add those values to the html variable
      html += "value 1: " + val1 + " expression value: " + m + "<br />"

    # insert the html from the html variable into the extension.
    $("#" + divName).html(html)
  )

# Initiate extension
@extension_Init()
[/code]

Note that you need to include the --bare option when compiling the CoffeeScript code:

[code language=”bash”]

coffee --bare --compile Script.coffee

[/code]
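If you find yourself recompiling often during development, the coffee compiler can also watch the source file and recompile on every save (same assumptions as above, i.e. the source file is called Script.coffee):

[code language="bash"]

# recompile automatically whenever Script.coffee changes
coffee --bare --compile --watch Script.coffee

[/code]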

This will give us the following Javascript file, which is functionally equivalent to the first JS file above:

[code language=”text”]
// Generated by CoffeeScript 1.6.3
var extension_Done, extension_Init, template_path;

template_path = Qva.Remote + "?public=only&name=Extensions/template_simple_coffeescript/";

extension_Init = function() {
    if (typeof jQuery === 'undefined') {
        return Qva.LoadScript(template_path + 'jquery.js', extension_Done);
    } else {
        return extension_Done();
    }
};

extension_Done = function() {
    return Qva.AddExtension('template_simple_coffeescript', function() {
        var divName, html, m, row, rowIx, td, ui, val1, _i, _ref, _this;
        _this = this;
        divName = _this.Layout.ObjectId.replace("\\", "_");
        if (_this.Element.children.length === 0) {
            ui = document.createElement("div");
            ui.setAttribute("id", divName);
            _this.Element.appendChild(ui);
        } else {
            $("#" + divName).empty();
        }
        html = "";
        td = _this.Data;
        for (rowIx = _i = 0, _ref = td.Rows.length - 1; 0 <= _ref ? _i <= _ref : _i >= _ref; rowIx = 0 <= _ref ? ++_i : --_i) {
            row = td.Rows[rowIx];
            val1 = row[0].text;
            m = row[1].text;
            html += "value 1: " + val1 + " expression value: " + m + "<br />";
        }
        return $("#" + divName).html(html);
    });
};

this.extension_Init();
[/code]

Burning ISOs to USB sticks on Mac / OS X

For some reason I cannot get the easy-to-use ISO burning tools out there to work… Command line to the rescue:

First, make sure Homebrew is installed. It is not strictly needed for the burning-to-thumb-drive process, but it will enable the progress indicators, which are quite nice to have for long-running tasks. Now install Pipe Viewer from Homebrew:

[code language=”bash”]

$ brew install pv

[/code]

Now we need to figure out the device name of our USB drive. In a terminal window (you are using iTerm2 – right? Infinitely better than the built-in OS X Terminal app):

[code language=”bash”]

$ diskutil list

/dev/disk0
#: TYPE NAME SIZE IDENTIFIER
0: GUID_partition_scheme *251.0 GB disk0
1: EFI EFI 209.7 MB disk0s1
2: Apple_HFS Macintosh HD 250.1 GB disk0s2
3: Apple_Boot Recovery HD 650.0 MB disk0s3
/dev/disk1
#: TYPE NAME SIZE IDENTIFIER
0: GUID_partition_scheme *320.1 GB disk1
1: EFI EFI 209.7 MB disk1s1
2: Apple_HFS SSD backup 180.0 GB disk1s2
3: Apple_HFS Temp 139.6 GB disk1s3
/dev/disk2
#: TYPE NAME SIZE IDENTIFIER
0: GUID_partition_scheme *1.0 TB disk2
1: EFI EFI 209.7 MB disk2s1
2: Apple_HFS Macken_Ext Backup 999.9 GB disk2s2
/dev/disk3
#: TYPE NAME SIZE IDENTIFIER
0: FDisk_partition_scheme *8.0 GB disk3
1: DOS_FAT_32 WHEEZY 8.0 GB disk3s1
$

[/code]

/dev/disk3 is the USB thumb drive. I previously had another Wheezy image on it, thus its name.

Now unmount it:

[code language=”bash”]

$ diskutil unmountDisk /dev/disk3
Unmount of all volumes on disk3 was successful
$

[/code]

Nice. Now let’s write the ISO to the drive:

[code language=”bash”]

$ pv -petr ~/Desktop/debian-7.2.0-amd64-DVD-1.iso | sudo dd of=/dev/disk3 bs=128k
Password:
0:00:38 [4.94MiB/s] [====>                  ] 3% ETA 0:16:55

[/code]

Now let’s wait. Looks like it will take approximately another 17 minutes..

When done, just eject the thumb drive as usual, remove it and you have a bootable Debian install drive. Mission accomplished.
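For those who prefer to stay in the terminal, the eject step can be done there as well (assuming the thumb drive is still /dev/disk3):

[code language="bash"]

# eject the thumb drive once dd has finished
$ diskutil eject /dev/disk3

[/code]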

Netgear RN312 firmware upgrade 6.0.8 to 6.1.2

Firmware 6.1.2 showing up as available on the RN312.

Seems Netgear just released firmware version 6.1.2 for those products that support the new UI (I believe UltraNAS and other non-Intel based devices do not get the benefits of the new version 6 and above firmware – or maybe they have reconsidered – not sure).

Updating is always a bit scary when you have a smoothly running system, but the release notes mainly covered higher end devices (compared to the RN312 that I have), so why not..

The upgrade seems to have gone well – all the applications I have installed myself (CrashPlan, Monitorix etc) seem to work OK.

Firmware 6.1.2 installed on the RN312.


Moving CrashPlan cache and log directories to new locations

As discussed in a previous post, the ReadyNAS might run out of disk space on the 4 GB root partition if you install software other than that provided by NetGear.

In my case it was CrashPlan’s cache and log files that were filling up the root partition, with warning emails every 10 minutes telling me that 81% of the root partition was used, then 82%… 83%… – so the files needed a new home. Turns out it is not too hard:

ssh into the NAS, then su to become root. Stop CrashPlan (if it is running):

[code language=”bash”]
root@RN312:/home/admin# service crashplan stop
Stopping CrashPlan Engine … OK
root@RN312:/home/admin#
[/code]

Make a copy of CrashPlan’s configuration file, in case something goes wrong:

[code language=”bash”]
root@RN312:/home/admin# cp /usr/local/crashplan/conf/my.service.xml /usr/local/crashplan/conf/my.service.xml.orig
root@RN312:/home/admin#
[/code]

Take a look at CrashPlan’s cache directory:

[code language=”bash”]
root@RN312:/home/admin# ls -lah /usr/local/crashplan/cache/
total 40M
drwxr-sr-x 1 root staff  106 Sep 25 03:00 .
drwxr-sr-x 1 root staff  258 Sep 25 21:31 ..
drwxr-sr-x 1 root staff  170 Sep 25 21:31 42
-rw-r--r-- 1 root staff 8.4K Sep 25 21:31 cpft1_42
-rw-r--r-- 1 root staff 1.9K Sep 25 21:31 cpft1_42i
-rw-r--r-- 1 root staff 2.1K Sep 25 21:31 cpft1_42x
-rw-r--r-- 1 root staff  23M Sep 25 21:31 cpgft1
-rw-r--r-- 1 root staff 8.8M Sep 25 21:31 cpgft1i
-rw-r--r-- 1 root staff 7.9M Sep 25 21:31 cpgft1x
-rw-r--r-- 1 root staff  986 Sep 25 03:02 cpss1
root@RN312:/home/admin#
[/code]

Create cache directory in new location:

[code language=”bash”]
root@RN312:/home/admin# mkdir /home/admin/from_root/crashplan/cache
[/code]

Change the config file to point to the new location (using your favourite editor, vim used here):

[code language=”bash”]
root@RN312:/home/admin# vim /usr/local/crashplan/conf/my.service.xml
[/code]

Change
<cachePath>/usr/local/crashplan/cache</cachePath>
to
<cachePath>/home/admin/from_root/crashplan/cache</cachePath>

(Adjust as needed if you have selected some other place for the CrashPlan files.)

Now move the cache files:

[code language=”bash”]
root@RN312:/home/admin# mv /usr/local/crashplan/cache/* /home/admin/from_root/crashplan/cache/
root@RN312:/home/admin#
[/code]

Time to move CrashPlan’s log files. They are originally stored in /usr/local/crashplan/log/, let’s move them to /home/admin/from_root/crashplan/log.

[code language=”bash”]
root@RN312:/home/admin# ls -lah /usr/local/crashplan/log/
total 111M
drwxrwxrwx 1 root staff  346 Sep 23 04:41 .
drwxr-sr-x 1 root staff  258 Sep 25 21:31 ..
-rw-r--r-- 1 root root   33K Sep 25 21:31 app.log
-rw-r--r-- 1 root root   23M Sep 25 21:31 backup_files.log.0
-rw-r--r-- 1 root root   26M Jul 12 19:50 backup_files.log.1
-rw-rw-rw- 1 root root     0 Aug 15 15:21 engine_error.log
-rw-r--r-- 1 root root  6.4K Sep 25 21:31 engine_output.log
-rw-r--r-- 1 root root  180K Sep 25 21:31 history.log.0
-rw-r--r-- 1 root root  501K Sep 17 13:47 history.log.1
-rw-r--r-- 1 root root  501K Aug 25 08:10 history.log.2
-rw-rw-rw- 1 root root     0 Aug 15 15:24 restore_files.log.0
-rw-r--r-- 1 root root   13M Sep 25 21:31 service.log.0
-rw-r--r-- 1 root root   26M Sep 23 04:41 service.log.1
-rw-r--r-- 1 root root   26M Sep 17 14:35 service.log.2
root@RN312:/home/admin#
root@RN312:/home/admin# mkdir /home/admin/from_root/crashplan/log
root@RN312:/home/admin#
[/code]
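With the new log directory in place, move the existing log files over (a straightforward sketch, assuming the same target directory as above):

[code language="bash"]
root@RN312:/home/admin# mv /usr/local/crashplan/log/* /home/admin/from_root/crashplan/log/
root@RN312:/home/admin#
[/code]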

Find the fileHandler tags (there are 4 of them dealing with log files) and modify them so they point to the new log directory. So, once again edit /usr/local/crashplan/conf/my.service.xml – part of mine looks like this after moving the log files. Change the paths as needed for your choice of new directories:

[code language=”bash”]
<serviceLog>
    <fileHandler append="true" count="2" level="ALL" limit="26214400" pattern="/home/admin/from_root/crashplan/log/service.log"/>
  </serviceLog>
  <serviceErrorInterval>3600000</serviceErrorInterval>
  <historyLog>
    <fileHandler append="true" count="10" level="ALL" limit="512000" pattern="/home/admin/from_root/crashplan/log/history.log"/>
  </historyLog>
[/code]

Start CrashPlan again:

[code language=”bash”]
root@RN312:/home/admin# service crashplan start
Starting CrashPlan Engine … OK
root@RN312:/home/admin#
[/code]

And finally check free disk space on /:

[code language=”bash”]
root@RN312:/usr/local/crashplan/log# df -h
Filesystem      Size  Used Avail Use% Mounted on
rootfs          4.0G  1.7G  1.8G  49% /
tmpfs            10M  4.0K   10M   1% /dev
/dev/md0        4.0G  1.7G  1.8G  49% /
tmpfs           2.0G     0  2.0G   0% /dev/shm
tmpfs           2.0G  5.8M  2.0G   1% /run
tmpfs           2.0G     0  2.0G   0% /sys/fs/cgroup
tmpfs           2.0G     0  2.0G   0% /media
/dev/md127      2.8T  1.1T  1.7T  39% /data
/dev/md127      2.8T  1.1T  1.7T  39% /home
/dev/md127      2.8T  1.1T  1.7T  39% /apps
root@RN312:/usr/local/crashplan/log#
[/code]

49% – nice!

Installing Debian on old ASUS motherboards

I had a couple of decommissioned ASUS motherboards (M2NPV-VM and A8N-VM CSM) lying around, as well as a 19″ cabinet with ATX cases in it. Together they make a good setup for lab work, trying out Linux server stuff, serving as a test bed for network gear etc.

Installing Linux (Debian) is usually pretty easy, but there were a couple of snags along the way.
So, note to self: read this if these motherboards need to be reinstalled sometime. It will save some time.

Booting from USB flash disk

  1. The BIOS of both boards needs to be changed so that the flash disk is the 1st disk (ahead of the SSD that is also installed), and also 1st in the boot order. Otherwise the board will not boot from the thumb drive.
  2. Install Debian as usual.
  3. Once you get to the GRUB installation part of the Debian install, follow the default setting and install to the first disk. Which is the flash thumb drive, I know. But trying to get the Debian installer to install GRUB anywhere else just failed consistently – I have no idea why. It should have worked to install it to /dev/sdb (which is the SSD).
  4. Reboot into recovery mode with the thumb drive still inserted (as GRUB was installed to it, remember?). You should now end up in a command line shell.
  5. Do a “grub-install /dev/sdb” to install GRUB to the SSD. The device names might be different depending on the installed hardware – check with “ls /dev”, “lsblk” and related commands to get the device name of the SSD. See the sketch after this list.
  6. Reboot, quickly remove the thumb drive during the reboot, and GRUB should now appear, served from the SSD.
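For reference, step 5 from the recovery shell looks roughly like this (a sketch that assumes the SSD really is /dev/sdb – double check before running anything):

[code language="bash"]

# confirm which device is the SSD
lsblk

# install GRUB to the SSD's MBR and regenerate the boot menu
grub-install /dev/sdb
update-grub

[/code]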

Windows 8 and Debian Wheezy dual boot

An old Dell XPS M1330 laptop has been collecting dust around here for ages. It’s one of those “yes… I am sure that laptop will come in handy some day…” machines, and I finally took the time to set it up as a test machine for Windows 8 and Linux. I also had an unused SSD drive that could replace the old and slow HD in the laptop.

Time to get to work!

Install Windows 8

  1. Get the Windows 8 installer. Run it, then go through all the steps until you get an option to “Install by creating media”. A Windows machine of some kind is obviously needed for this. This will create a bootable USB flash drive or an ISO, with all the Windows install files on it.
  2. You can actually install Windows 8 directly from the flash drive, but once you try to activate Windows it will tell you that the product key can only be used for upgrades. I found this out the hard way and had to redo the whole process.
  3. Get a copy of Windows XP, Vista or whatever earlier Windows version you can find. I happened to have a bunch of XP Pro licenses taken from old computers over the years. If going the XP route, it might be worth installing from XP SP3 (rather than SP2 or earlier), IIRC the pre-SP3 XP versions were rather crappy.
  4. Windows XP + SSD = <FAIL>. As the Windows 8 license was an upgrade, I had to get some prior Windows version installed first. Turned out that Win XP SP3 didn’t play nicely with the SATA2 SSD I had installed. Probably some missing drivers in the XP installation – SATA2 just had not been invented when Win XP was hot, I guess.
    I had to change a couple of BIOS parameters handling flash cache and SATA emulation (reverting back to some older ATA variant, I believe. Not sure, but it worked).
    The XP installer then detected the SSD and fired up as expected.
  5. Install XP from CD/DVD, as was done in the old days. No need to apply updates etc once it is installed. I didn’t activate Windows XP Genuine Advantage either.
  6. While in XP, start the Windows 8 installer from the flash drive created in step 1 above. From here it’s a pretty easy ride – I think I went with the defaults most of the way.
  7. Windows 8 should then be installed, and XP gone. Nice.

If you have a fresh-install product key for Windows 8, you can most likely skip the XP installation steps above, of course.

Install Debian

  1. Download UNetbootin to the new Windows 8 machine. No need to install, it’s a standalone application.
  2. Use UNetbootin to create a bootable Debian installation flash drive. All the actual Debian files will be downloaded during the installation, so the flash drive can be small (I used an old 256 MB one). I went with the Debian Stable_Netinstall, worked well.
  3. Reboot the computer to start the Debian installer. If the entire disk was allocated to Windows during that installation (it would have been, unless you repartitioned it yourself) you need to make some space for Debian. The Debian installer allows you to do this in the partitioning section. Go to the partition that Windows is installed on, hit enter and you can edit the size of the partition. Apply.
  4. While still in the partitioner, move to the now free/unused space on the SSD, and use the assisted partitioner for all unused free space. Going with the recommended option (all data on same partition) is fine. You will then get /dev/sdb5 and /dev/sdb6 partitions for general use and swap, respectively.
    NOTE: When booting from the USB flash drive it gets the name /dev/sda. The SSD is /dev/sdb, with the Windows partition being called /dev/sdb1.
  5. The Debian installer can be a bit cryptic the first times you use it, but it’s not too bad. Going with the defaults is usually fine.
  6. One of the last steps is to install the GRUB boot loader. Now, this can be done in different ways. The easiest is to just follow the suggestion to install GRUB to the Master Boot Record. This will overwrite the Windows boot loader (which in Windows 8 is actually pretty nice, with a graphical UI, mouse support etc).
  7. When the Debian installer finishes and the computer reboots, quickly remove the flash drive, and if all is well GRUB should now kick in, showing Debian side by side with Windows 8 (see the sketch below if Windows does not show up in the menu).
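If Windows 8 does not show up in the GRUB menu after the first boot into Debian, it can usually be added by letting GRUB re-scan for other operating systems (a sketch assuming the os-prober package is installed, which the Debian installer normally handles; run as root or via sudo):

[code language="bash"]

# list other operating systems found on the disks
os-prober

# rebuild the GRUB menu, picking up the os-prober results
update-grub

[/code]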

If you want to use the Windows 8 boot loader, you need to reinstall it. I first thought I would do this, but changed my mind… GRUB might not have the prettiest UI around, but it works.

I think the last part of this article might be useful if you still want to switch back to using the Windows 8 boot loader.

Closing thoughts

Now that XP is no longer anywhere on that SSD, it should be safe to switch the BIOS back to proper SATA mode. Windows 8 didn’t boot when I did that though… Not sure why. After switching back to the old legacy mode both Windows 8 and Debian boot fine, so I guess that decides it.

I did actually also do some initial work on the SSD, upgrading the firmware of it, as well as using the GParted Linux distro on a flash stick (once again using UNetbootin to create the flash disk) to create a FAT32 partition and align it as described in this post. No idea if that was really necessary..

Misc sources providing input for the above

http://www.howtogeek.com/99060/how-to-dual-boot-windows-8-and-linux-mint-on-the-same-pc/
http://unix.stackexchange.com/questions/76932/installing-debian-7-besides-windows-8
http://askubuntu.com/questions/217904/installed-ubuntu-my-windows8-not-booting/218006#218006

JBC CD-2BC soldering station unboxing

Got a new toy the other day…

After using the same ERSA MS 6000 soldering station for the past 20 (!) years or so, it was time to upgrade. Nothing wrong with the old one really, except that it was hard to get new tips and that heating it up took a minute or two.

Getting a Chinese rip-off from eBay would be easy, but if the next soldering station is also going to last 20 years, why not get something slightly better?

JBC has a good reputation and seemed to have good value for money. So – here are some pics from the unboxing. Enjoy!

Two dollar variable fan controller

Well – given the current exchange rate, AUD 2.59 = USD 2.33. So it’s not quite a two dollar product, but pretty close. And “2.33 dollar fan controller” did not make for a nice subject line…

After the earlier post about variable 12V fan controllers, it might be worth looking at what is available on eBay. Turns out you can get a variable controller delivered anywhere in the world for USD 2.33 – pretty amazing! It looks deceptively like the Zalman Fan Mate 2 too:

Zalman Fan Mate 2 (left, USD 7) and eBay ditto (right, USD 2.3)

The eBay controller only has one 3-pin male connector (where the fan connects), and then a soldered in wire with a 3-pin female connector, for attaching to the PC or other equipment.

The Zalman on the other hand has a 6-pin male connector on one end; a special Y-cable (it comes with the Fan Mate 2) is then needed to hook up the controller to the fan and the PC. Both variants of course work, but the Zalman approach is maybe slightly better, as it allows the controller to be mounted closer to an inside corner without the cables being in the way. Not a major difference though.

Looking inside the eBay controller, it is obviously different from the Zalman. For starters, it has an NEC B772 P PNP medium power transistor in there, rather than a voltage regulator. I could not find a datasheet for that particular NEC device, but I am pretty sure it is more or less identical to ST’s 2SB772.

There is also a TL431 adjustable shunt regulator in there, together with a second SOT-23 transistor marked J6 – it might be an S9014 NPN transistor (or equivalent).

So, in essence the eBay controller is also a linear regulator, but based off an adjustable regulator (rather than the fixed-voltage 7805 that the Zalman uses), with an extra power transistor to boost current. The extra transistor is needed, as the TL431 can only sink 100 mA on its own.

All good so far. But when reverse engineering the eBay controller, the schematic just doesn’t add up. Below is what the eBay controller looks like, with the above assumptions on components – and this is not a working circuit, as far as I can tell (or is it? Feel free to add your expertise in the comments!).

eBay variable fan controller – except that the circuit is a bit weird.. Need to re-check those PCB traces!

So…. either I made incorrect assumptions regarding what SMD components are used in the eBay controller, or I just didn’t check closely enough how the PCB traces were connected. Time to bring out the multimeter to check those traces – more to come on this topic.