Thursday, July 2, 2015

TPI Calibration Specification



This test determines the tracks per inch (TPI) setting for each head of the drive under test.
It is considered a drive validation test (DVT).
The actual VTPI test is not made of discrete steps like the preamp and jog calibrations; it basically performs sector reads and computes error rates.
The error rate is based on per-bit performance.
First, collect all information and data necessary for the test; the estimated testing time per head is 20 minutes.

Its sequence

3.1 Test algorithm

Preamp gain calibration
Set Kilo Flux Change per Inch (KFCI) to 1, and set TPI to 1. The flux change refers to the number of magnetic flux changes in the tangential direction of the media. Both KFCI and TPI have tables which contain the numbers of flux changes and tracks respectively. In each table, entry 1 has the largest TPI value; as the test goes on, the number of tracks per inch decreases.

2T jog optimization

Set TPI from 1 up to the number of entries in the table mentioned above and do the following:
a. FIR, Bias, Iwrite, WPC, etc.
b. Read and measure error rates using sector mode and “manually” count errors.

Decide which TPIs to set based on error rates and thresholds. Before starting the test, a target error rate is assigned; this is defined in input data parameters 12 and 13. Starting from the largest possible number of tracks per inch, the TPI test runs until the error delta gets lower than the threshold value, as sketched at the end of this section.

Save all debug and results to disk. 
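To make the selection loop concrete, here is a minimal Python sketch of it; the table, the measurement callback, and the threshold names are hypothetical stand-ins, not the actual firmware code.

# Minimal sketch of the TPI selection loop described in this section.
# tpi_table, measure_error_rate, and the thresholds are hypothetical stand-ins
# for the firmware's TPI table and input data parameters 12/13.

def select_tpi(tpi_table, measure_error_rate, target_rate, max_delta):
    """Walk the TPI table from entry 1 (densest) downward and return the
    first setting whose error delta drops below the threshold."""
    for index, tracks_per_inch in enumerate(tpi_table, start=1):
        rate = measure_error_rate(tracks_per_inch)  # sector reads, "manual" error count
        delta = rate - target_rate
        if delta < max_delta:                       # error delta below threshold: accept
            return index, tracks_per_inch
    return None                                     # no entry met the target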

3.2 Test Structure Initialization

All structures used during this test will be set to initialized values prior to any collection of data. 

3.3 Input Data

Data is passed to the test through the SPT/user interface. The structure for the data is defined in section 4. 

3.4 Starting the test

Upon start-up control can be passed to the VTPI test through the Process Self Test (PST) sequencer table or through the Vendor Specific Command (VSC) Execute File. Upon the conclusion of the VTPI test control reverts back to the PST sequencer table or exits if it’s started via the VSC command.

3.5 Errors

This test is designed to continue as long as possible in the event of an error. Only fatal errors, errors that would make the test data or results meaningless, will result in immediate exit from the test without the setting of the VTPI. 

Touchdown Calibration Specification

Touchdown calibration is used to find the maximum power that can be applied to the heater element for each head on a drive.
This is done by applying power to each head until contact occurs or the maximum allowable power has been applied.
Head-to-media contact happens only during the calibration process.
The head is not supposed to make contact with the media during normal read/write operation, so that this stress is not registered in the servo bias.
The result of this check is saved in file 0x46.
The check ID is 0x6f.

First, collect all information and data necessary for the check.
The estimated testing time per head is 2 minutes.
The number of testing zones is defined in 0x6f.

Its sequence
1-Get the check data from DATA_BUFFER.
2-Run the preamp gain calibration and the 2T jog test.
3-Set the write buffer to zero, then do a write cylinder/head/sector (CHS) to collect reference information.
4-Accumulate the servo flex bias: as the head gets close to the media, the head arm deviates from the track, so the servo flex bias circuit increases the amount of current to bring the head arm back to the desired track.
5-Save the results to file IDs 0x96 and 0x46.
A sketch of the power ramp behind step 4 follows this list.
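The ramp-and-detect idea can be sketched roughly like this in Python; the power step, the bias threshold, and both callbacks are assumptions for illustration only.

# Sketch of the touchdown power ramp for one head/zone.
# apply_heater_power, read_servo_bias and the constants are assumptions.

def find_touchdown_power(apply_heater_power, read_servo_bias,
                         baseline_bias, max_power, step=1, bias_threshold=5):
    """Raise heater power until the accumulated servo flex bias shows the
    arm being pulled back to the track (contact) or max power is reached."""
    power = 0
    while power <= max_power:
        apply_heater_power(power)
        bias = read_servo_bias()
        # Contact: the arm deviates, so the servo adds current to recentre it.
        if abs(bias - baseline_bias) > bias_threshold:
            return power            # touchdown power for this head/zone
        power += step
    return max_power                # no contact within the allowable power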


Be aware that at the end all information and storage are cleaned up, and the test data in 0x96 and 0x46 is updated with the maximum read/write power settings for each head/zone.
Then dynamic fly height (DFH) can be executed.

Input data structure
1-0x6f  

 

Writer-Reader Gap Calibration Specification

This test determines the writer-reader gap for each head on the hard drive under test.
All failure analysis data is stored in the reserved area (zone 0).
File 0x46 is the file which contains the writer-reader gap information (gap calibration).
So 0x46 holds optimized channel parameters stored in zone 0; these optimized parameters in 0x46 are used in the read and write operations of zones 1 through 20.

File 0x47 holds optimized channel parameters stored in the flash area inside the SoC chip.

After calibration, this 0x46 file is used for read and write timing.

0x46: optimized channel parameters for the writer-reader gap.
0x6a is the check command ID.

Its sequence
-2T jog
-WRRO
-Calibrate the writer-to-reader gap at the inner diameter (ID) of the drive under test; the optimized gap values for the rest of the zones are calculated using algorithms (illustrated in the sketch after this list).
-Save the gap calibration information to files 0x46 and 0x9c.
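The post does not say which algorithm derives the other zones' gap values from the ID measurement; the sketch below just assumes a simple linear scaling, purely to illustrate the calibrate-once-then-derive idea.

# Purely illustrative: derive per-zone gap values from one measurement taken
# at the inner diameter (zone 20). The linear scaling factor is made up.

def extrapolate_gap(id_gap, zone_count=21, id_zone=20, slope=0.01):
    """Return a dict of zone -> estimated writer-reader gap."""
    return {zone: id_gap * (1.0 - slope * (id_zone - zone))
            for zone in range(zone_count)}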

Starting the test
Start-up control can be passed to the checking module through:
1-The PST sequencer table
2-The vendor specific command (VSC) execute file
Then control reverts back to the caller, which is the controller code.

Input data structure
1-Code file name: cocode.bin (file ID 0xc4)
2-Data file name: codata.bin (file ID 0xc5)
3-Result file name: wrcalres.bin (file ID 0x9c)
The input parameters are as follows:


1-test command
The writer-reader gap calibration check, which is called with 0x6A.
2-chamber temp
The environment temperature of the drive under check.
3-headdcm
The single character that specifies the DCM of the head on the drive under check.
4-mediadcm
The single character that specifies the DCM of the media on the drive under check.
5-head qual
The zone to be tested; it must be represented in a bitmap variable.
Only one zone will be calibrated; the default test zone is the last zone (the last user zone).

For example, with 21 zones the bitmap 0x100000 selects only the last zone (zone 20), while 0x100402 selects zones 1, 10, and 20, as decoded in the sketch below.
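A quick way to see which zones a given bitmap selects (the bit-per-zone encoding follows from the two examples above; the helper itself is hypothetical):

# Decode a zone bitmap: bit N set means zone N is selected for calibration.

def zones_from_bitmap(bitmap, zone_count=21):
    return [zone for zone in range(zone_count) if bitmap & (1 << zone)]

print(zones_from_bitmap(0x100000))  # [20]         -> last zone only (the default)
print(zones_from_bitmap(0x100402))  # [1, 10, 20]  -> zones 1, 10 and 20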
Result file
The results of the check are written to file 0x9c.

mini arco test flow

The mini ARCO check performs the following list of optimizations on the channel for zone 0.
Mini ARCO must distinguish where each optimization applies (channel, head/preamp); the results go inside the 0x47 "optimized channel parameters" file.
 

But first we need to know the following:
1-Preamp gain
There are preamp gain stages in the preamp chip; this optimization chooses proper values for amplifying the head signal.
2-2T jog
Because of the high rotational speed there can be a problem with the signal coming from the media: its amplitude may not meet the minimum requirement. The 2T jog uses the VGA (variable gain amplifier) to amplify the signal at each jog sweep position. The sweep position with the lowest VGA value is chosen, since that is where the head picks up the strongest signal from the track (a generic sweep sketch follows this list).
3-VM (Viterbi Margin) jog
The same as the 2T jog, but the VM jog finds the parameters with the lowest error rate among the tested sweep points.
4-WRRO (write repeatable read-out) delay calibration
What the head reads from the media consists of the servo burst and RRO (used by the servo to make self-adjustments of positioning) followed by the data; this optimization controls the timing between the servo burst and the data.
5-MR bias
The preamp supports two types of heads:
GMR: giant magnetoresistive
TMR: tunneling magnetoresistive
The preamp provides two ways to bias the head:
voltage bias
current bias
6-MRA: the same as above.
7-FIR (finite impulse response)
Main channel equalization.
8-Baseline correction
Removes the DC offset of the signal.
9-Write precompensation (WPC)
Minimizes the phase shift in the signal.
10-Analog boost
Amplifies the signal.
11-Analog cutoff
A filter that passes low frequencies and blocks high-frequency noise.
12-ATI (Adjacent Track Interference) check
Since the distance between tracks is so small, interference with adjacent tracks occurs and can cause data loss; this optimization addresses that problem.
13-RFPE (Reverse Field Partial Erasure)
When a head is writing data to the media, there is a chance that a reverse magnetic flux occurs and partially erases data; optimizing RFPE prevents that.
14-Write current
The preamp chip can control the overshoot amplitude and decay period of the write current. Overshoot occurs when the head writes data; with this optimization you can control that overshoot amplitude and decay period.
15-WPC: as above.
16-MNP (media noise processing)
Reduces media noise.
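The 2T jog and VM jog items above are both sweep-style searches: step through the jog positions, measure something at each one, and keep the best. A generic Python sketch of that pattern (the offsets and the measurement callback are placeholders):

# Generic jog sweep: measure a metric at each offset and keep the best one.
# For the 2T jog the metric would be the VGA value (lower means a stronger
# signal); for the VM jog it would be the error rate at each sweep point.

def jog_sweep(offsets, measure_metric):
    best_offset, best_value = None, None
    for offset in offsets:
        value = measure_metric(offset)
        if best_value is None or value < best_value:
            best_offset, best_value = offset, value
    return best_offset, best_value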

The mini ARCO log file is saved in 0x90.
The summary result file is saved in 0xc0.
The input parameters are defined in file 0x44:

Check command
The mini ARCO check, which is called with 0x44.
Chamber temp
This input value specifies the environment temperature of the drive under check.
Head DCM
This input value specifies the DCM of the heads installed on the drive under check. The field must be in the form of the translated head DCM based on the head DCM for the given product family.
Media DCM
This input value specifies the DCM of the media installed on the drive under check. This field must be in the form of the translated media DCM based on the media DCM for the given product family.

mini arco


Western Digital uses an SoC (system on chip) which contains:
CPU, flash, RAM, controller, servo, and channel.
ARCO sets the channel per zone (Trex, Windex); this channel is connected to the preamp chip.
Remember that we are working on the hard drive's firmware, not the EEPROM on the flash IC.
What is a zone?
The media contains 21 zones.
Zone 0, the reserved area, is at the outer diameter (OD) and contains manufacturer data.
Zone 20 is at the inner diameter (ID).

What is mini ARCO?
Mini ARCO optimizes the channel parameters for the drive's maximum read and write ability.
This optimization extracts the best channel parameters for zone 0;
the ARCO process then optimizes data zones 1 through 20.

Mini ARCO gives priority to the code in zone 0, and everything is done under that priority (the required files must be downloaded).

Executing the mini ARCO test (instruction code and data on zone 0)
All the required files are downloaded to the flash of the SoC:
1-Permanent overlay: the instruction code and data that will be stored in the reserved zone.
2-Servo DVT: the media usually cannot stay perfectly centered because of the high rotational speed, so a servo adjustment has to be done for this.
3-ARCO data file: head, media, and preamp parameters.
4-ARCO code file: module code.

Once these files are downloaded, we can run the mini ARCO test by executing the VSC command (a conceptual sketch of this ordering follows).
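Conceptually the launch is just "download the four files, then fire the VSC"; the sketch below mirrors only that ordering, and every name in it is a hypothetical placeholder rather than a real vendor command.

# High-level ordering of the mini ARCO launch described above.
# download_to_soc_flash and execute_vsc are hypothetical placeholders,
# not real vendor commands.

REQUIRED_FILES = [
    "permanent_overlay",   # instruction code/data for the reserved zone
    "servo_dvt",           # servo adjustment data
    "arco_data",           # head/media/preamp parameters
    "arco_code",           # module code
]

def run_mini_arco(download_to_soc_flash, execute_vsc):
    for name in REQUIRED_FILES:
        download_to_soc_flash(name)     # step 1: load the files into SoC flash
    return execute_vsc("mini_arco")     # step 2: start the test via the VSC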

hard drive recovery tools part 2


SeDiv is one of the advanced utilities, and that is why SeDiv supports ARCO.
What is ARCO?
ARCO is a way to repair firmware problems and adapt firmware to work with any module.
ARCO can do the following:
1-Set the reading and writing heads.
2-Adapt firmware for a hard drive, so you can use firmware across models; for example, with Western Digital models you can modify one model's firmware to use with other models.
3-Set the spacing between tracks.
4-Set the spacing between heads (which is called microjog), set the translator, set the format, and many other functions.
Well, ARCO seems great, so what about SeDiv?
SeDiv supports ARCO, which is good. SeDiv provides many tools: one for Western Digital, another for Seagate, another for Toshiba, and so on.
It supports reading and writing the ROM, generating a ROM, and, through adaptive editing, modifying modules.
It can scan modules, read them, and write tracks quickly.
It can cut zones and set the appropriate ARCO parameters you need to repair the HDD.
It can also test heads and do an excellent logical scan.
It offers self scan and unlocking for new families.
There is a tool better than SeDiv, and we sell it at a low price.

Data recovery equipment
The most famous of this equipment is:
Data Compass, HDD Doctor, and the PC UDMA...?

Some will ask: PC, you mean the PC-3000? Yes, but not the Chinese one.
The PC UDMA is from ACE Laboratories, which is Russian, not Chinese.
Data recovery equipment is used to recover data in urgent cases on hard drives,
such as bad sectors and so on.
Suppose a hard drive has many bad sectors and the data cannot be accessed,
or the printed circuit board needs to be changed,
or there is a firmware problem, a slow HDD, or an HDD that drops the operating system.
This equipment is for all HDD types, Western Digital or Seagate.

hard drives recovery tools part 1


Nowadays the market contains much equipment on this topic. We will explain each option, look at the reality of it, and see which equipment is good to buy and where to start in hard drive recovery.

1-Chinese PC-3000
The PC-3000 is good for IDE hard drives: it can do everything with the G-list and P-list, cut heads, cut bad tracks, even handle bad sectors, but only on hard drives below 160 GB.
So this equipment is not good nowadays, as it is very poor with ROYL hard drives and SATA.
Of course, this equipment is for Western Digital only.
It consists of a card, a dongle, and a motherboard, hence the name PC.
2-Salvation Data

Salvation Data can be considered the start of vendor-specific equipment: when you buy it, you get one model for Western Digital hard drives, another for Toshiba, another for Seagate, and so on.
This equipment has many new features, such as generating a ROM, writing modules and tracks, clearing the G-list and P-list, SMART support, and solving passwords.
Unlike the PC-3000, it supports all modern hard drives, ROYL and so on.
I dislike its slowness and that it cannot cut heads well, but the Salvation Data company provides buyers with updates.
3-DFL

DFL's price is nearly the same as Salvation Data's, and DFL starts where Salvation Data ends; I am talking about the slow-hard-drive problem and cutting zones too.
But neither of them is stable.
Salvation Data could not add entries to the P-list of a hard drive, but DFL can do that; some people have tricks to force Salvation Data to do it.
Salvation Data is very bad at testing heads: many times I tested a drive and it said the heads were good, yet if you scan the drive with DFL you find that there is a bad head.
And even when Salvation Data does detect a bad head, in many cases it cannot cut heads either.
The worst thing about Salvation Data is that it cannot solve passwords and makes many errors.
Finally, DFL supports its customers with many updates every month, but Salvation Data only does that once every several months.
Salvation Data does not support USB hard drives.
Salvation Data does not support ROYL HDDs.
It does not support ARCO, which is so important for modifying firmware and using it with different HDD models.
It is slow at copying or testing any module.
Why is Salvation Data not good with tracks?
It only reads tracks 9 to 20 and not the rest; this flaw is fatal because the important tracks are 1 to 9.
It is not good for bad sectors.
There is no scan like MHDD or other programs.
But DFL has many features that Salvation Data lacks:
It determines the position of bad sectors per head without testing the heads.
It can format the P-list and G-list, and has an excellent self scan.
It can reposition any module to another position without slowness or any problem.
It has force loading, which forces the HDD to read the service area, and it can format the service area if you want that too.
It has SATA and PATA ports without using converters.
It recovers module 47, plus ATA commands.
It gives many details about the hard drive.
It can generate a module from a track and generate a ROM for ROYL hard drives.
Generate module from track and generate rom for roly hard drives