Blog Archive

Monday, April 29, 2013

NCE 4 Best sentences for reciting

(Apr. 29, 2013 Begin)
Lesson 1: Finding fossil man
1) The only way that they can preserve their history is to recount it as sagas--legends handed down from one generation of storytellers to another.
2) But the first people who were like ourselves lived so long ago that even their sagas, if they had any, are forgotten.

Lesson 2:
1) How much of each year do spiders spend killing insects?
2) Why, you may wonder, should spiders be our friends?
3) Insects would make it impossible for us to live in the world; they would devour all our crops and kill our flock and herds, if it were not for the protection we get from insect-eating animals.
4) Moreover, unlike some of the other insect eaters, spiders never do the least harm to us or our belongings.
5) Spiders are not insects, as many people think, nor even nearly related to them.
6) How many spiders are engaged in this work on our behalf?


Lesson 3:
1) Modern alpinists try to climb mountains by a route which will give them good sport, and the more difficult it is, the more highly it is regarded.
2) In the pioneering days, however, this was not the case at all. The early climbers were looking for the easiest way to the top, because the summit was the prize they sought, especially if it had never been attained before. It is true that during their explorations they often faced difficulties and dangers of the most perilous nature, equipped in a manner which would make a modern climber shudder at the thought, but they did not go out of their way to court such excitement. They had a single aim, a solitary goal--the top!
3) Invariably the background is the same: dirt and poverty, and very uncomfortable.

Lesson 4:
1) Several cases have been reported recently in Russia of people who can read and detect colors with their fingers, and even see through solid doors and walls.
2) One case concerns an eleven-year-old schoolgirl, Vera Petrova, who has normal vision but who can also perceive things with different parts of her skin.
3) It was also found that although she could perceive things through her fingers this ability ceased the moment her hands were wet.

Lesson 5:
1) People are always talking about 'the problem of youth'. If there is one----which I take leave to doubt----then it is older people who create it, not the young themselves. Let us get down to fundamentals and agree that the young are after all human beings----people just like their elders. There is only one difference between an old man and a young one: the young man has a glorious future before him and the old one has a splendid future behind him: and maybe that is where the rub is.

Lesson 6: The sporting spirit
1) Even if one did not know from concrete examples (the 1936 Olympic Games, for instance), one could deduce it from general principles.
2) Nearly all the sports practiced nowadays are competitive. You play to win, and the game has little meaning unless you do your utmost to win.
3) At the international level, sport is frankly mimic warfare. But the significant thing is not the behavior of the players but the attitude of the spectators and, behind the spectators, of the nations who work themselves into furies over these absurd contests, and seriously believe----at any rate for short periods----that running, jumping and kicking a ball are tests of national virtue.

Lesson 7: Bats
1) To get a full appreciation of what this means we must turn first to some recent human inventions. Everyone knows that if he shouts in the vicinity of a wall or mountainside, an echo will come back. The further off this solid obstruction, the longer time will elapse for the return of the echo.
2) So it is a comparatively simple step from locating the sea bottom to locating a shoal of fish.

Sunday, April 28, 2013

Inaugural Address (March 4, 1905) Theodore Roosevelt

US presidential speeches collection
http://millercenter.org/president


http://millercenter.org/president/speeches/detail/3564


But justice and generosity in a nation, as in an individual, count most when shown not by the weak but by the strong. While ever careful to refrain from wronging others, we must be no less insistent that we are not wronged ourselves. -- Theodore Roosevelt

Friday, April 26, 2013

Doddington's Zoo: Sheep, Goat, Lamb, Wolf




Sheep – Sheep comprise our default speaker type. In our model, sheep dominate the population and systems perform nominally well for them. 

Goats – Goats, in our model, are those speakers who are particularly difficult to recognize. Goats tend to adversely affect the performance of systems by accounting for a disproportionate share of the missed detections. The goat population can be an especially important problem for entry control systems, where it is important that all users be reliably accepted.

Lambs – Lambs, in our model, are those speakers who are particularly easy to imitate. That is, a randomly chosen speaker is exceptionally likely to be accepted as a lamb. Lambs tend to adversely affect the performance of systems by accounting for a disproportionate share of the false alarms. This represents a potential system weakness, if lambs can be identified, either through trial and error or through correlation with other directly observable characteristics.

Wolves – Wolves, in our model, are those speakers who are particularly successful at imitating other speakers. That is, their speech is exceptionally likely to be accepted as that of another speaker. Wolves tend to adversely affect the performance of systems by accounting for a disproportionate share of the false alarms. This represents a potential system weakness, if wolves can be identified and recruited to defeat systems.
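
To make the taxonomy concrete, here is a rough sketch (my own toy illustration, not from the Doddington paper) of how one might compute the per-speaker statistics behind these labels, assuming a list of verification trials of the form (model_speaker, test_speaker, score) and a fixed decision threshold:

from collections import defaultdict

THRESHOLD = 0.5  # hypothetical operating point

def zoo_stats(trials, threshold=THRESHOLD):
    miss = defaultdict(lambda: [0, 0])  # target speaker -> [misses, target trials]
    lamb = defaultdict(lambda: [0, 0])  # target speaker -> [false accepts, impostor trials against them]
    wolf = defaultdict(lambda: [0, 0])  # impostor speaker -> [false accepts, impostor trials by them]
    for model_spk, test_spk, score in trials:
        if model_spk == test_spk:        # target trial
            miss[model_spk][1] += 1
            if score < threshold:
                miss[model_spk][0] += 1  # missed detection (goat behavior)
        else:                            # impostor trial
            lamb[model_spk][1] += 1
            wolf[test_spk][1] += 1
            if score >= threshold:
                lamb[model_spk][0] += 1  # speaker easily imitated (lamb behavior)
                wolf[test_spk][0] += 1   # successful imitator (wolf behavior)
    rate = lambda d: {spk: n / t for spk, (n, t) in d.items() if t}
    return rate(miss), rate(lamb), rate(wolf)

Speakers with unusually high rates in the three tables are the goats, lambs, and wolves, respectively.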

Ref:

Doddington, George, et al. "Sheep, goats, lambs and wolves: A statistical analysis of speaker performance in the NIST 1998 speaker recognition evaluation." National Institute of Standards and Technology, Gaithersburg, MD, 1998.

quick tutorial on matlab parfor


Step 0: Please make sure you have the Parallel Computing Toolbox available (matlabpool requires it).

Step 1: Prepare the parallel environment (suppose you have 8 cores):

matlabpool(8)   % open a local pool with 8 workers (run matlabpool close when finished)

Step 2: Replace each parallelizable "for" keyword with "parfor".
Make sure the iterations are independent of each other; in the benchmark below, for example, each iteration writes only its own column c(:,i) and does not read anything computed by the other iterations.



Step 3: Run your script and compare timings:


% parallel version
c = zeros(1000, 8);            % preallocate the result
tic
parfor i = 1:8
    c(:,i) = eig(rand(1000));  % each iteration fills only its own column
end
toc

% serial version, for comparison
c = zeros(1000, 8);
tic
for i = 1:8
    c(:,i) = eig(rand(1000));
end
toc


%% Results:

Elapsed time is 3.718411 seconds.
Elapsed time is 5.671640 seconds.


Conclusion:

The serial for loop (the second timing, 5.67 s) is significantly slower than the parfor version (the first timing, 3.72 s).



Reference:
http://www.mathworks.com/help/distcomp/getting-started-with-parfor.html#brb2x57

Thursday, April 25, 2013

results for a full DBN experiment



python code/DBN.py
Downloading data from http://www.iro.umontreal.ca/~lisa/deep/data/mnist/mnist.pkl.gz
... loading data
... building the model
... getting the pretraining functions
... pre-training the model
Pre-training layer 0, epoch 0, cost  -98.605633351
Pre-training layer 0, epoch 1, cost  -83.821740962

...
Pre-training layer 0, epoch 98, cost  -68.5052478816
Pre-training layer 0, epoch 99, cost  -68.519563826
Pre-training layer 1, epoch 0, cost  -171.643871631
Pre-training layer 1, epoch 1, cost  -149.515493823
Pre-training layer 1, epoch 2, cost  -144.451413679
Pre-training layer 1, epoch 3, cost  -141.756252063
Pre-training layer 1, epoch 4, cost  -139.878107992
Pre-training layer 1, epoch 5, cost  -138.535599652
Pre-training layer 1, epoch 6, cost  -137.402074458
Pre-training layer 1, epoch 7, cost  -136.497999148
Pre-training layer 1, epoch 8, cost  -135.716982528
Pre-training layer 1, epoch 9, cost  -135.051954135
Pre-training layer 1, epoch 10, cost  -134.447771265
Pre-training layer 1, epoch 11, cost  -133.973466419
Pre-training layer 1, epoch 12, cost  -133.551518055
Pre-training layer 1, epoch 13, cost  -133.139880511
Pre-training layer 1, epoch 14, cost  -132.788173252
Pre-training layer 1, epoch 15, cost  -132.489801008
Pre-training layer 1, epoch 16, cost  -132.237055108
Pre-training layer 1, epoch 17, cost  -131.975347485
Pre-training layer 1, epoch 18, cost  -131.771333127
Pre-training layer 1, epoch 19, cost  -131.540889258
Pre-training layer 1, epoch 20, cost  -131.384947825
Pre-training layer 1, epoch 21, cost  -131.225772659
Pre-training layer 1, epoch 22, cost  -131.070327762
Pre-training layer 1, epoch 23, cost  -130.941930783
Pre-training layer 1, epoch 24, cost  -130.778483759
Pre-training layer 1, epoch 25, cost  -130.668305024
Pre-training layer 1, epoch 26, cost  -130.555502214
Pre-training layer 1, epoch 27, cost  -130.478535277
Pre-training layer 1, epoch 28, cost  -130.364395118
Pre-training layer 1, epoch 29, cost  -130.287104187
Pre-training layer 1, epoch 30, cost  -130.210807909
Pre-training layer 1, epoch 31, cost  -130.107654161
Pre-training layer 1, epoch 32, cost  -130.028662833
Pre-training layer 1, epoch 33, cost  -129.992199401
Pre-training layer 1, epoch 34, cost  -129.88685884
Pre-training layer 1, epoch 35, cost  -129.847813521
Pre-training layer 1, epoch 36, cost  -129.786785169
Pre-training layer 1, epoch 37, cost  -129.730273604
Pre-training layer 1, epoch 38, cost  -129.697135786
Pre-training layer 1, epoch 39, cost  -129.636278175
Pre-training layer 1, epoch 40, cost  -129.618305605
Pre-training layer 1, epoch 41, cost  -129.551387057
Pre-training layer 1, epoch 42, cost  -129.49368135
Pre-training layer 1, epoch 43, cost  -129.476903705
Pre-training layer 1, epoch 44, cost  -129.425981055
Pre-training layer 1, epoch 45, cost  -129.389322047
Pre-training layer 1, epoch 46, cost  -129.362279036
Pre-training layer 1, epoch 47, cost  -129.337946502
Pre-training layer 1, epoch 48, cost  -129.304442912
Pre-training layer 1, epoch 49, cost  -129.260127775
Pre-training layer 1, epoch 50, cost  -129.221536862
Pre-training layer 1, epoch 51, cost  -129.210379139
Pre-training layer 1, epoch 52, cost  -129.169092219
Pre-training layer 1, epoch 53, cost  -129.131854654
Pre-training layer 1, epoch 54, cost  -129.133793916
Pre-training layer 1, epoch 55, cost  -129.099146517
Pre-training layer 1, epoch 56, cost  -129.087258479
Pre-training layer 1, epoch 57, cost  -129.060194224
Pre-training layer 1, epoch 58, cost  -129.027875632
Pre-training layer 1, epoch 59, cost  -128.995649811
Pre-training layer 1, epoch 60, cost  -128.977655279
Pre-training layer 1, epoch 61, cost  -128.945129528
Pre-training layer 1, epoch 62, cost  -128.940161975
Pre-training layer 1, epoch 63, cost  -128.939732915
Pre-training layer 1, epoch 64, cost  -128.941077969
Pre-training layer 1, epoch 65, cost  -128.902307804
Pre-training layer 1, epoch 66, cost  -128.868177422
Pre-training layer 1, epoch 67, cost  -128.874653731
Pre-training layer 1, epoch 68, cost  -128.867196338
Pre-training layer 1, epoch 69, cost  -128.864623846
Pre-training layer 1, epoch 70, cost  -128.843517828
Pre-training layer 1, epoch 71, cost  -128.827557045
Pre-training layer 1, epoch 72, cost  -128.79131154
Pre-training layer 1, epoch 73, cost  -128.773199697
Pre-training layer 1, epoch 74, cost  -128.769646485
Pre-training layer 1, epoch 75, cost  -128.776708309
Pre-training layer 1, epoch 76, cost  -128.759387417
Pre-training layer 1, epoch 77, cost  -128.714281256
Pre-training layer 1, epoch 78, cost  -128.71048483
Pre-training layer 1, epoch 79, cost  -128.731262644
Pre-training layer 1, epoch 80, cost  -128.701462441
Pre-training layer 1, epoch 81, cost  -128.666946237
Pre-training layer 1, epoch 82, cost  -128.682094488
Pre-training layer 1, epoch 83, cost  -128.677214794
Pre-training layer 1, epoch 84, cost  -128.66396384
Pre-training layer 1, epoch 85, cost  -128.65356571
Pre-training layer 1, epoch 86, cost  -128.6662782
Pre-training layer 1, epoch 87, cost  -128.618077204
Pre-training layer 1, epoch 88, cost  -128.607518439
Pre-training layer 1, epoch 89, cost  -128.640183712
Pre-training layer 1, epoch 90, cost  -128.619865401
Pre-training layer 1, epoch 91, cost  -128.62696618
Pre-training layer 1, epoch 92, cost  -128.599751823
Pre-training layer 1, epoch 93, cost  -128.591895957
Pre-training layer 1, epoch 94, cost  -128.554581802
Pre-training layer 1, epoch 95, cost  -128.555603175
Pre-training layer 1, epoch 96, cost  -128.565845135
Pre-training layer 1, epoch 97, cost  -128.544153488
Pre-training layer 1, epoch 98, cost  -128.548815867
Pre-training layer 1, epoch 99, cost  -128.560006264
Pre-training layer 2, epoch 0, cost  -70.0880996936
Pre-training layer 2, epoch 1, cost  -57.9429660768
Pre-training layer 2, epoch 2, cost  -55.4440582901
Pre-training layer 2, epoch 3, cost  -54.1089682447
Pre-training layer 2, epoch 4, cost  -53.2713100853
Pre-training layer 2, epoch 5, cost  -52.6138329286
Pre-training layer 2, epoch 6, cost  -52.1256050418
Pre-training layer 2, epoch 7, cost  -51.719914925
Pre-training layer 2, epoch 8, cost  -51.3744889513
Pre-training layer 2, epoch 9, cost  -51.0875345829
Pre-training layer 2, epoch 10, cost  -50.7886417596
Pre-training layer 2, epoch 11, cost  -50.5449013346
Pre-training layer 2, epoch 12, cost  -50.3558212492
Pre-training layer 2, epoch 13, cost  -50.1790201092
Pre-training layer 2, epoch 14, cost  -49.9945276798
Pre-training layer 2, epoch 15, cost  -49.8243693353
Pre-training layer 2, epoch 16, cost  -49.6824833823
Pre-training layer 2, epoch 17, cost  -49.5411691089
Pre-training layer 2, epoch 18, cost  -49.4336682825
Pre-training layer 2, epoch 19, cost  -49.3077187712
Pre-training layer 2, epoch 20, cost  -49.184447665
Pre-training layer 2, epoch 21, cost  -49.1042597712
Pre-training layer 2, epoch 22, cost  -48.9922685439
Pre-training layer 2, epoch 23, cost  -48.9134351854
Pre-training layer 2, epoch 24, cost  -48.8040250662
Pre-training layer 2, epoch 25, cost  -48.7458521848
Pre-training layer 2, epoch 26, cost  -48.6870817501
Pre-training layer 2, epoch 27, cost  -48.5984471639
Pre-training layer 2, epoch 28, cost  -48.5314310613
Pre-training layer 2, epoch 29, cost  -48.4915239528
Pre-training layer 2, epoch 30, cost  -48.4237038956
Pre-training layer 2, epoch 31, cost  -48.3788210521
Pre-training layer 2, epoch 32, cost  -48.3353179124
Pre-training layer 2, epoch 33, cost  -48.2622428232
Pre-training layer 2, epoch 34, cost  -48.2177285838
Pre-training layer 2, epoch 35, cost  -48.1769980633
Pre-training layer 2, epoch 36, cost  -48.1261212064
Pre-training layer 2, epoch 37, cost  -48.0982768957
Pre-training layer 2, epoch 38, cost  -48.0616504145
Pre-training layer 2, epoch 39, cost  -48.0000899992
Pre-training layer 2, epoch 40, cost  -47.9540308821
Pre-training layer 2, epoch 41, cost  -47.9264332891
Pre-training layer 2, epoch 42, cost  -47.896046838
Pre-training layer 2, epoch 43, cost  -47.8527590216
Pre-training layer 2, epoch 44, cost  -47.8439895524
Pre-training layer 2, epoch 45, cost  -47.7944288355
Pre-training layer 2, epoch 46, cost  -47.7490626803
Pre-training layer 2, epoch 47, cost  -47.7302210646
Pre-training layer 2, epoch 48, cost  -47.7173249508
Pre-training layer 2, epoch 49, cost  -47.6623472115
Pre-training layer 2, epoch 50, cost  -47.6406938763
Pre-training layer 2, epoch 51, cost  -47.6004131255
Pre-training layer 2, epoch 52, cost  -47.5870140934
Pre-training layer 2, epoch 53, cost  -47.5679143658
Pre-training layer 2, epoch 54, cost  -47.5403547917
Pre-training layer 2, epoch 55, cost  -47.5121944423
Pre-training layer 2, epoch 56, cost  -47.5091982126
Pre-training layer 2, epoch 57, cost  -47.4731519626
Pre-training layer 2, epoch 58, cost  -47.4387040136
Pre-training layer 2, epoch 59, cost  -47.4307169026
Pre-training layer 2, epoch 60, cost  -47.4255378595
Pre-training layer 2, epoch 61, cost  -47.3967116458
Pre-training layer 2, epoch 62, cost  -47.3831292468
Pre-training layer 2, epoch 63, cost  -47.3586719299
Pre-training layer 2, epoch 64, cost  -47.3250683865
Pre-training layer 2, epoch 65, cost  -47.3121818155
Pre-training layer 2, epoch 66, cost  -47.3253935473
Pre-training layer 2, epoch 67, cost  -47.287971487
Pre-training layer 2, epoch 68, cost  -47.2598606592
Pre-training layer 2, epoch 69, cost  -47.2452731909
Pre-training layer 2, epoch 70, cost  -47.2544936942
Pre-training layer 2, epoch 71, cost  -47.2097531117
Pre-training layer 2, epoch 72, cost  -47.2062705306
Pre-training layer 2, epoch 73, cost  -47.2004000095
Pre-training layer 2, epoch 74, cost  -47.2022627461
Pre-training layer 2, epoch 75, cost  -47.1479103501
Pre-training layer 2, epoch 76, cost  -47.1796911954
Pre-training layer 2, epoch 77, cost  -47.1508054459
Pre-training layer 2, epoch 78, cost  -47.1542654028
Pre-training layer 2, epoch 79, cost  -47.1457913565
Pre-training layer 2, epoch 80, cost  -47.1093583361
Pre-training layer 2, epoch 81, cost  -47.1113689527
Pre-training layer 2, epoch 82, cost  -47.0899995027
Pre-training layer 2, epoch 83, cost  -47.0954480194
Pre-training layer 2, epoch 84, cost  -47.072423908
Pre-training layer 2, epoch 85, cost  -47.0470443175
Pre-training layer 2, epoch 86, cost  -47.0431776517
Pre-training layer 2, epoch 87, cost  -46.9994554695
Pre-training layer 2, epoch 88, cost  -47.0142823816
Pre-training layer 2, epoch 89, cost  -46.9953037602
Pre-training layer 2, epoch 90, cost  -46.9977725268
Pre-training layer 2, epoch 91, cost  -47.0004457747
Pre-training layer 2, epoch 92, cost  -46.9652256313
Pre-training layer 2, epoch 93, cost  -46.9635554107
Pre-training layer 2, epoch 94, cost  -46.9435821098
Pre-training layer 2, epoch 95, cost  -46.9602020613
Pre-training layer 2, epoch 96, cost  -46.9421359975
Pre-training layer 2, epoch 97, cost  -46.9499591082
Pre-training layer 2, epoch 98, cost  -46.9460360187
Pre-training layer 2, epoch 99, cost  -46.9196266617
The pretraining code for file DBN.py ran for 789.28m
... getting the finetuning functions
... finetunning the model
epoch 1, minibatch 5000/5000, validation error 3.090000 %
     epoch 1, minibatch 5000/5000, test error of best model 3.450000 %
epoch 2, minibatch 5000/5000, validation error 2.530000 %
     epoch 2, minibatch 5000/5000, test error of best model 2.730000 %
epoch 3, minibatch 5000/5000, validation error 2.250000 %
     epoch 3, minibatch 5000/5000, test error of best model 2.440000 %
epoch 4, minibatch 5000/5000, validation error 2.050000 %
     epoch 4, minibatch 5000/5000, test error of best model 2.210000 %
epoch 5, minibatch 5000/5000, validation error 1.940000 %
     epoch 5, minibatch 5000/5000, test error of best model 1.930000 %
epoch 6, minibatch 5000/5000, validation error 1.820000 %
     epoch 6, minibatch 5000/5000, test error of best model 1.850000 %
epoch 7, minibatch 5000/5000, validation error 1.690000 %
     epoch 7, minibatch 5000/5000, test error of best model 1.860000 %
epoch 8, minibatch 5000/5000, validation error 1.600000 %
     epoch 8, minibatch 5000/5000, test error of best model 1.770000 %
epoch 9, minibatch 5000/5000, validation error 1.590000 %
     epoch 9, minibatch 5000/5000, test error of best model 1.730000 %
epoch 10, minibatch 5000/5000, validation error 1.550000 %
     epoch 10, minibatch 5000/5000, test error of best model 1.670000 %
epoch 11, minibatch 5000/5000, validation error 1.460000 %
     epoch 11, minibatch 5000/5000, test error of best model 1.600000 %
epoch 12, minibatch 5000/5000, validation error 1.490000 %
epoch 13, minibatch 5000/5000, validation error 1.490000 %
epoch 14, minibatch 5000/5000, validation error 1.500000 %
epoch 15, minibatch 5000/5000, validation error 1.520000 %
epoch 16, minibatch 5000/5000, validation error 1.500000 %
epoch 17, minibatch 5000/5000, validation error 1.490000 %
epoch 18, minibatch 5000/5000, validation error 1.460000 %
epoch 19, minibatch 5000/5000, validation error 1.440000 %
     epoch 19, minibatch 5000/5000, test error of best model 1.520000 %
epoch 20, minibatch 5000/5000, validation error 1.430000 %
     epoch 20, minibatch 5000/5000, test error of best model 1.510000 %
epoch 21, minibatch 5000/5000, validation error 1.440000 %
epoch 22, minibatch 5000/5000, validation error 1.440000 %
epoch 23, minibatch 5000/5000, validation error 1.430000 %
epoch 24, minibatch 5000/5000, validation error 1.440000 %
epoch 25, minibatch 5000/5000, validation error 1.440000 %
epoch 26, minibatch 5000/5000, validation error 1.440000 %
epoch 27, minibatch 5000/5000, validation error 1.440000 %
epoch 28, minibatch 5000/5000, validation error 1.450000 %
epoch 29, minibatch 5000/5000, validation error 1.440000 %
epoch 30, minibatch 5000/5000, validation error 1.440000 %
epoch 31, minibatch 5000/5000, validation error 1.410000 %
     epoch 31, minibatch 5000/5000, test error of best model 1.410000 %
epoch 32, minibatch 5000/5000, validation error 1.400000 %
     epoch 32, minibatch 5000/5000, test error of best model 1.420000 %
epoch 33, minibatch 5000/5000, validation error 1.400000 %
epoch 34, minibatch 5000/5000, validation error 1.400000 %
epoch 35, minibatch 5000/5000, validation error 1.400000 %
epoch 36, minibatch 5000/5000, validation error 1.400000 %
epoch 37, minibatch 5000/5000, validation error 1.390000 %
     epoch 37, minibatch 5000/5000, test error of best model 1.390000 %
epoch 38, minibatch 5000/5000, validation error 1.380000 %
     epoch 38, minibatch 5000/5000, test error of best model 1.400000 %
epoch 39, minibatch 5000/5000, validation error 1.380000 %
epoch 40, minibatch 5000/5000, validation error 1.380000 %
epoch 41, minibatch 5000/5000, validation error 1.370000 %
     epoch 41, minibatch 5000/5000, test error of best model 1.380000 %
epoch 42, minibatch 5000/5000, validation error 1.360000 %
     epoch 42, minibatch 5000/5000, test error of best model 1.370000 %
epoch 43, minibatch 5000/5000, validation error 1.360000 %
epoch 44, minibatch 5000/5000, validation error 1.360000 %
epoch 45, minibatch 5000/5000, validation error 1.360000 %
epoch 46, minibatch 5000/5000, validation error 1.360000 %
epoch 47, minibatch 5000/5000, validation error 1.360000 %
epoch 48, minibatch 5000/5000, validation error 1.360000 %
epoch 49, minibatch 5000/5000, validation error 1.360000 %
epoch 50, minibatch 5000/5000, validation error 1.360000 %
epoch 51, minibatch 5000/5000, validation error 1.370000 %
epoch 52, minibatch 5000/5000, validation error 1.370000 %
epoch 53, minibatch 5000/5000, validation error 1.380000 %
epoch 54, minibatch 5000/5000, validation error 1.380000 %
epoch 55, minibatch 5000/5000, validation error 1.370000 %
epoch 56, minibatch 5000/5000, validation error 1.370000 %
epoch 57, minibatch 5000/5000, validation error 1.360000 %
epoch 58, minibatch 5000/5000, validation error 1.360000 %
epoch 59, minibatch 5000/5000, validation error 1.370000 %
epoch 60, minibatch 5000/5000, validation error 1.370000 %
epoch 61, minibatch 5000/5000, validation error 1.360000 %
epoch 62, minibatch 5000/5000, validation error 1.350000 %
     epoch 62, minibatch 5000/5000, test error of best model 1.300000 %
epoch 63, minibatch 5000/5000, validation error 1.350000 %
epoch 64, minibatch 5000/5000, validation error 1.360000 %
epoch 65, minibatch 5000/5000, validation error 1.360000 %
epoch 66, minibatch 5000/5000, validation error 1.360000 %
epoch 67, minibatch 5000/5000, validation error 1.350000 %
epoch 68, minibatch 5000/5000, validation error 1.340000 %
     epoch 68, minibatch 5000/5000, test error of best model 1.290000 %
epoch 69, minibatch 5000/5000, validation error 1.340000 %
epoch 70, minibatch 5000/5000, validation error 1.340000 %
epoch 71, minibatch 5000/5000, validation error 1.340000 %
epoch 72, minibatch 5000/5000, validation error 1.340000 %
epoch 73, minibatch 5000/5000, validation error 1.340000 %
epoch 74, minibatch 5000/5000, validation error 1.340000 %
epoch 75, minibatch 5000/5000, validation error 1.340000 %
epoch 76, minibatch 5000/5000, validation error 1.330000 %
     epoch 76, minibatch 5000/5000, test error of best model 1.300000 %
epoch 77, minibatch 5000/5000, validation error 1.330000 %
epoch 78, minibatch 5000/5000, validation error 1.330000 %
epoch 79, minibatch 5000/5000, validation error 1.330000 %
epoch 80, minibatch 5000/5000, validation error 1.330000 %
epoch 81, minibatch 5000/5000, validation error 1.330000 %
epoch 82, minibatch 5000/5000, validation error 1.320000 %
     epoch 82, minibatch 5000/5000, test error of best model 1.310000 %
epoch 83, minibatch 5000/5000, validation error 1.320000 %
epoch 84, minibatch 5000/5000, validation error 1.320000 %
epoch 85, minibatch 5000/5000, validation error 1.310000 %
     epoch 85, minibatch 5000/5000, test error of best model 1.300000 %
epoch 86, minibatch 5000/5000, validation error 1.310000 %
epoch 87, minibatch 5000/5000, validation error 1.300000 %
     epoch 87, minibatch 5000/5000, test error of best model 1.300000 %
epoch 88, minibatch 5000/5000, validation error 1.300000 %
epoch 89, minibatch 5000/5000, validation error 1.300000 %
epoch 90, minibatch 5000/5000, validation error 1.300000 %
epoch 91, minibatch 5000/5000, validation error 1.300000 %
epoch 92, minibatch 5000/5000, validation error 1.300000 %
epoch 93, minibatch 5000/5000, validation error 1.300000 %
epoch 94, minibatch 5000/5000, validation error 1.300000 %
epoch 95, minibatch 5000/5000, validation error 1.300000 %
epoch 96, minibatch 5000/5000, validation error 1.300000 %
epoch 97, minibatch 5000/5000, validation error 1.300000 %
epoch 98, minibatch 5000/5000, validation error 1.300000 %
epoch 99, minibatch 5000/5000, validation error 1.300000 %
epoch 100, minibatch 5000/5000, validation error 1.300000 %
epoch 101, minibatch 5000/5000, validation error 1.300000 %
epoch 102, minibatch 5000/5000, validation error 1.300000 %
epoch 103, minibatch 5000/5000, validation error 1.300000 %
epoch 104, minibatch 5000/5000, validation error 1.300000 %
epoch 105, minibatch 5000/5000, validation error 1.300000 %
epoch 106, minibatch 5000/5000, validation error 1.290000 %
     epoch 106, minibatch 5000/5000, test error of best model 1.290000 %
epoch 107, minibatch 5000/5000, validation error 1.280000 %
     epoch 107, minibatch 5000/5000, test error of best model 1.290000 %
epoch 108, minibatch 5000/5000, validation error 1.280000 %
epoch 109, minibatch 5000/5000, validation error 1.290000 %
epoch 110, minibatch 5000/5000, validation error 1.290000 %
epoch 111, minibatch 5000/5000, validation error 1.290000 %
epoch 112, minibatch 5000/5000, validation error 1.290000 %
epoch 113, minibatch 5000/5000, validation error 1.290000 %
epoch 114, minibatch 5000/5000, validation error 1.290000 %
epoch 115, minibatch 5000/5000, validation error 1.290000 %
epoch 116, minibatch 5000/5000, validation error 1.290000 %
epoch 117, minibatch 5000/5000, validation error 1.290000 %
epoch 118, minibatch 5000/5000, validation error 1.290000 %
epoch 119, minibatch 5000/5000, validation error 1.290000 %
epoch 120, minibatch 5000/5000, validation error 1.290000 %
epoch 121, minibatch 5000/5000, validation error 1.290000 %
epoch 122, minibatch 5000/5000, validation error 1.290000 %
epoch 123, minibatch 5000/5000, validation error 1.290000 %
epoch 124, minibatch 5000/5000, validation error 1.290000 %
epoch 125, minibatch 5000/5000, validation error 1.290000 %
epoch 126, minibatch 5000/5000, validation error 1.290000 %
epoch 127, minibatch 5000/5000, validation error 1.290000 %
epoch 128, minibatch 5000/5000, validation error 1.300000 %
epoch 129, minibatch 5000/5000, validation error 1.300000 %
epoch 130, minibatch 5000/5000, validation error 1.300000 %
epoch 131, minibatch 5000/5000, validation error 1.300000 %
epoch 132, minibatch 5000/5000, validation error 1.300000 %
epoch 133, minibatch 5000/5000, validation error 1.300000 %
epoch 134, minibatch 5000/5000, validation error 1.300000 %
epoch 135, minibatch 5000/5000, validation error 1.300000 %
epoch 136, minibatch 5000/5000, validation error 1.300000 %
epoch 137, minibatch 5000/5000, validation error 1.300000 %
epoch 138, minibatch 5000/5000, validation error 1.300000 %
epoch 139, minibatch 5000/5000, validation error 1.300000 %
epoch 140, minibatch 5000/5000, validation error 1.300000 %
epoch 141, minibatch 5000/5000, validation error 1.300000 %
epoch 142, minibatch 5000/5000, validation error 1.300000 %
epoch 143, minibatch 5000/5000, validation error 1.300000 %
epoch 144, minibatch 5000/5000, validation error 1.300000 %
epoch 145, minibatch 5000/5000, validation error 1.300000 %
epoch 146, minibatch 5000/5000, validation error 1.300000 %
epoch 147, minibatch 5000/5000, validation error 1.300000 %
epoch 148, minibatch 5000/5000, validation error 1.300000 %
epoch 149, minibatch 5000/5000, validation error 1.300000 %
epoch 150, minibatch 5000/5000, validation error 1.300000 %
epoch 151, minibatch 5000/5000, validation error 1.300000 %
epoch 152, minibatch 5000/5000, validation error 1.300000 %
epoch 153, minibatch 5000/5000, validation error 1.300000 %
epoch 154, minibatch 5000/5000, validation error 1.300000 %
epoch 155, minibatch 5000/5000, validation error 1.300000 %
epoch 156, minibatch 5000/5000, validation error 1.300000 %
epoch 157, minibatch 5000/5000, validation error 1.300000 %
epoch 158, minibatch 5000/5000, validation error 1.300000 %
epoch 159, minibatch 5000/5000, validation error 1.300000 %
epoch 160, minibatch 5000/5000, validation error 1.300000 %
epoch 161, minibatch 5000/5000, validation error 1.300000 %
epoch 162, minibatch 5000/5000, validation error 1.300000 %
epoch 163, minibatch 5000/5000, validation error 1.300000 %
epoch 164, minibatch 5000/5000, validation error 1.300000 %
epoch 165, minibatch 5000/5000, validation error 1.300000 %
epoch 166, minibatch 5000/5000, validation error 1.300000 %
epoch 167, minibatch 5000/5000, validation error 1.300000 %
epoch 168, minibatch 5000/5000, validation error 1.300000 %
epoch 169, minibatch 5000/5000, validation error 1.300000 %
epoch 170, minibatch 5000/5000, validation error 1.300000 %
epoch 171, minibatch 5000/5000, validation error 1.300000 %
epoch 172, minibatch 5000/5000, validation error 1.300000 %
epoch 173, minibatch 5000/5000, validation error 1.300000 %
epoch 174, minibatch 5000/5000, validation error 1.300000 %
epoch 175, minibatch 5000/5000, validation error 1.300000 %
epoch 176, minibatch 5000/5000, validation error 1.300000 %
epoch 177, minibatch 5000/5000, validation error 1.300000 %
epoch 178, minibatch 5000/5000, validation error 1.300000 %
epoch 179, minibatch 5000/5000, validation error 1.300000 %
epoch 180, minibatch 5000/5000, validation error 1.300000 %
epoch 181, minibatch 5000/5000, validation error 1.300000 %
epoch 182, minibatch 5000/5000, validation error 1.300000 %
epoch 183, minibatch 5000/5000, validation error 1.290000 %
epoch 184, minibatch 5000/5000, validation error 1.280000 %
epoch 185, minibatch 5000/5000, validation error 1.280000 %
epoch 186, minibatch 5000/5000, validation error 1.280000 %
epoch 187, minibatch 5000/5000, validation error 1.280000 %
epoch 188, minibatch 5000/5000, validation error 1.280000 %
epoch 189, minibatch 5000/5000, validation error 1.280000 %
epoch 190, minibatch 5000/5000, validation error 1.280000 %
epoch 191, minibatch 5000/5000, validation error 1.280000 %
epoch 192, minibatch 5000/5000, validation error 1.280000 %
epoch 193, minibatch 5000/5000, validation error 1.280000 %
epoch 194, minibatch 5000/5000, validation error 1.280000 %
epoch 195, minibatch 5000/5000, validation error 1.280000 %
epoch 196, minibatch 5000/5000, validation error 1.280000 %
epoch 197, minibatch 5000/5000, validation error 1.280000 %
epoch 198, minibatch 5000/5000, validation error 1.280000 %
epoch 199, minibatch 5000/5000, validation error 1.280000 %
epoch 200, minibatch 5000/5000, validation error 1.280000 %
epoch 201, minibatch 5000/5000, validation error 1.280000 %
epoch 202, minibatch 5000/5000, validation error 1.280000 %
epoch 203, minibatch 5000/5000, validation error 1.280000 %
epoch 204, minibatch 5000/5000, validation error 1.280000 %
epoch 205, minibatch 5000/5000, validation error 1.280000 %
epoch 206, minibatch 5000/5000, validation error 1.280000 %
epoch 207, minibatch 5000/5000, validation error 1.280000 %
epoch 208, minibatch 5000/5000, validation error 1.280000 %
epoch 209, minibatch 5000/5000, validation error 1.280000 %
epoch 210, minibatch 5000/5000, validation error 1.280000 %
epoch 211, minibatch 5000/5000, validation error 1.280000 %
epoch 212, minibatch 5000/5000, validation error 1.280000 %
epoch 213, minibatch 5000/5000, validation error 1.280000 %
Optimization complete with best validation score of 1.280000 %,with test performance 1.290000 %
The fine tuning code for file DBN.py ran for 648.83m
Thu Apr 25 06:23:01 CDT 2013

recipe for NIST SRE 2014

Step 0: If you can, start from the best available audio quality:
Instead of using the lossy (μ-law) coding, use the full-bandwidth version of the microphone data (if available) [1]



Step 1: Front-end feature extraction: use an i-vector system

Step 2: Back-end classification: use a multi-session back-end [2]
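
As a rough illustration of the multi-session idea (a toy sketch only; see [2] for the back-ends actually investigated), one simple baseline is to pool a speaker's enrollment i-vectors by averaging, length-normalize, and score a test i-vector by cosine similarity:

import numpy as np

def multi_session_cosine_score(enroll_ivectors, test_ivector):
    # enroll_ivectors: (n_sessions, dim) array of enrollment i-vectors
    # test_ivector:    (dim,) array for the test utterance
    model = np.mean(enroll_ivectors, axis=0)           # pool the sessions
    model = model / np.linalg.norm(model)              # length-normalize
    test = test_ivector / np.linalg.norm(test_ivector)
    return float(np.dot(model, test))                  # cosine similarity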




Reference: 

[1] A. Stolcke, M. Graciarena, and L. Ferrer, "Effects of audio and ASR quality on cepstral and high-level speaker verification systems," in Proc. Odyssey 2012: The Speaker and Language Recognition Workshop, 2012. [PDF]

[2] G. Liu, T. Hasan, H. Boril, and J. H. L. Hansen, "An investigation on back-end for speaker recognition in multi-session enrollment," in Proc. IEEE ICASSP 2013, Vancouver, Canada, 2013. [PDF]

Wednesday, April 24, 2013

results for a small DBN experiment




python code/DBN_small.py
Downloading data from http://www.iro.umontreal.ca/~lisa/deep/data/mnist/mnist.pkl.gz
... loading data
... building the model
... getting the pretraining functions
... pre-training the model
Pre-training layer 0, epoch 0, cost  -98.605633351
Pre-training layer 0, epoch 1, cost  -83.821740962
Pre-training layer 0, epoch 2, cost  -80.7250660333
Pre-training layer 0, epoch 3, cost  -79.0545566378
Pre-training layer 0, epoch 4, cost  -77.9373883434
Pre-training layer 0, epoch 5, cost  -77.0672617796
Pre-training layer 0, epoch 6, cost  -76.4264764766
Pre-training layer 0, epoch 7, cost  -75.8230576646
Pre-training layer 0, epoch 8, cost  -75.3795806083
Pre-training layer 0, epoch 9, cost  -74.9426026512
Pre-training layer 1, epoch 0, cost  -259.323603643
Pre-training layer 1, epoch 1, cost  -234.892072755
Pre-training layer 1, epoch 2, cost  -229.839405554
Pre-training layer 1, epoch 3, cost  -227.188812758
Pre-training layer 1, epoch 4, cost  -225.417284897
Pre-training layer 1, epoch 5, cost  -224.139577706
Pre-training layer 1, epoch 6, cost  -223.164434731
Pre-training layer 1, epoch 7, cost  -222.394942526
Pre-training layer 1, epoch 8, cost  -221.768370618
Pre-training layer 1, epoch 9, cost  -221.278308113
Pre-training layer 2, epoch 0, cost  -76.0649870035
Pre-training layer 2, epoch 1, cost  -64.5806778821
Pre-training layer 2, epoch 2, cost  -62.436519382
Pre-training layer 2, epoch 3, cost  -61.3510303461
Pre-training layer 2, epoch 4, cost  -60.6772809506
Pre-training layer 2, epoch 5, cost  -60.2360054935
Pre-training layer 2, epoch 6, cost  -59.810636797
Pre-training layer 2, epoch 7, cost  -59.5407355314
Pre-training layer 2, epoch 8, cost  -59.3057561615
Pre-training layer 2, epoch 9, cost  -59.0920642013
The pretraining code for file DBN_small.py ran for 79.22m
... getting the finetuning functions
... finetunning the model
epoch 1, minibatch 5000/5000, validation error 3.820000 %
     epoch 1, minibatch 5000/5000, test error of best model 4.390000 %
epoch 2, minibatch 5000/5000, validation error 3.070000 %
     epoch 2, minibatch 5000/5000, test error of best model 3.510000 %
epoch 3, minibatch 5000/5000, validation error 2.710000 %
     epoch 3, minibatch 5000/5000, test error of best model 3.010000 %
epoch 4, minibatch 5000/5000, validation error 2.460000 %
     epoch 4, minibatch 5000/5000, test error of best model 2.640000 %
epoch 5, minibatch 5000/5000, validation error 2.200000 %
     epoch 5, minibatch 5000/5000, test error of best model 2.450000 %
epoch 6, minibatch 5000/5000, validation error 2.130000 %
     epoch 6, minibatch 5000/5000, test error of best model 2.230000 %
epoch 7, minibatch 5000/5000, validation error 2.050000 %
     epoch 7, minibatch 5000/5000, test error of best model 2.120000 %
epoch 8, minibatch 5000/5000, validation error 1.980000 %
     epoch 8, minibatch 5000/5000, test error of best model 2.060000 %
epoch 9, minibatch 5000/5000, validation error 2.000000 %
epoch 10, minibatch 5000/5000, validation error 1.980000 %
     epoch 10, minibatch 5000/5000, test error of best model 1.890000 %
Optimization complete with best validation score of 1.980000 %,with test performance 1.890000 %
The fine tuning code for file DBN_small.py ran for 32.26m

Note: A full DBN experiment will give results like:

Optimization complete with best validation score of 1.280000 %,with test performance 1.290000 %
The fine tuning code for file DBN.py ran for 648.83m

[SOLVED] Can't install nVidia drivers on Ubuntu 10.10

[SOLVED] Can't install nVidia drivers on Ubuntu 10.10: "sudo service gdm start"


Restricted Boltzmann Machines (RBM) — DeepLearning v0.1 documentation


We owe you so much more than just honoring you on days of grief and celebration

Noah's Ark Lab | From Big Data to Deep Knowledge


Internship at Noah’s Ark Lab (2013) | Noah's Ark Lab


Tuesday, April 23, 2013

How to install NumPy on Python in Ubuntu 12.04

Andrew!: Installing NumPy 1.6 on Python 2.7 in Ubuntu 12.04: "type(array([1,2,4,5]))"


Generative Classifier vs. Discriminative Classifier


Generative (e.g., naïve Bayes, GMM)

  • Assume some functional form for P(X|Y), P(Y)
  • This is the 'generative' model
  • Estimate parameters of P(X|Y), P(Y) directly from training data
  • Use Bayes' rule to calculate P(Y|X=xi)


Discriminative (e.g., SVM, linear regression, LDA)

  • Assume some functional form for P(Y|X)
  • This is the 'discriminative' model
  • Estimate parameters of P(Y|X) directly from training data



Examples:


Generative Model:

Gaussian mixture model and other types of mixture model
Hidden Markov model
Probabilistic context-free grammar
Naive Bayes
Averaged one-dependence estimators
Latent Dirichlet allocation
Restricted Boltzmann machine
Probabilistic Linear discriminant analysis (PLDA)

Discriminative Model:

Logistic regression
Support vector machines
Boosting (meta-algorithm)
Conditional random fields
Linear regression
Neural networks
Linear discriminant analysis (LDA)
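
A minimal sketch of the contrast with scikit-learn (assuming a recent version is installed): GaussianNB fits P(X|Y) and P(Y) and classifies through Bayes' rule, while LogisticRegression fits P(Y|X) directly.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB           # generative
from sklearn.linear_model import LogisticRegression  # discriminative

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for clf in (GaussianNB(), LogisticRegression(max_iter=1000)):
    clf.fit(X_tr, y_tr)
    print(type(clf).__name__, clf.score(X_te, y_te))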



What should you do right after a fresh installation of Ubuntu


Install an SSH server (e.g., via sudo apt-get install openssh-server):
https://help.ubuntu.com/10.04/serverguide/openssh-server.html

Monday, April 22, 2013

Stochastic Optimization - Cross Entropy Visualization

http://www.youtube.com/watch?v=tNAIHEse7Ms
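
For reference alongside the video, here is a toy sketch of the cross-entropy method for one-dimensional minimization (my own example, not taken from the video): sample candidates from a Gaussian, keep the elite fraction with the lowest cost, and refit the Gaussian to the elites.

import numpy as np

def cross_entropy_min(f, mu=0.0, sigma=5.0, n=100, elite_frac=0.2, iters=30):
    n_elite = max(1, int(n * elite_frac))
    for _ in range(iters):
        xs = np.random.normal(mu, sigma, n)       # sample candidates
        elites = xs[np.argsort(f(xs))[:n_elite]]  # keep the lowest-cost samples
        mu, sigma = elites.mean(), elites.std() + 1e-8
    return mu

print(cross_entropy_min(lambda x: (x - 3.0) ** 2))  # converges near 3.0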

seven habits reading note


Leadership and management (p53)

Management is a bottom-line focus: How can I best accomplish certain things? Leadership deals with the top line: What are the things I want to accomplish? In the words of both Peter Drucker and Warren Bennis, "Management is doing things right; leadership is doing the right things." Management is efficiency in climbing the ladder of success; leadership determines whether the ladder is leaning against the right wall.

Friday, April 19, 2013

How to change a Google site template

Step 1): Back up your old Google site.
Step 2): Enter the management mode, then delete your current Google site.
Step 3): Select your favorite Google site template.

How to create a Google web site in 1 minute.


Step 1: Log in or register at:
https://sites.google.com/?pli=1


Step 2:

Fill in the information and select your favorite template.

Step 3:

You can always edit the contents after you log in with your Google account.


Enjoy.

Thursday, April 18, 2013

How to draw a diagonal line in an Excel cell


Reference:

A: If you only need a single diagonal in an Excel cell, you can do it with cell formatting. Method:
    Right-click the cell -> Format Cells -> Border, then click the diagonal line control on the left of the border diagram (see the figure in the original post).

    To draw several lines in one cell, use the straight-line tool on the Drawing toolbar and enter the text with label controls; the original post includes an animation of these steps.
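
If you ever need to do this programmatically instead of through the Format Cells dialog, here is a small sketch using the openpyxl library (an assumption on my part; the post itself only covers the Excel UI):

from openpyxl import Workbook
from openpyxl.styles import Border, Side

wb = Workbook()
ws = wb.active
# draw a thin diagonal from the top-left to the bottom-right of cell A1
ws['A1'].border = Border(diagonal=Side(style='thin'), diagonalDown=True)
wb.save('diagonal.xlsx')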

Wednesday, April 17, 2013

In Google Drive, how can you link directly to “Download” a zip file and not view the contents?

Reference:
http://webapps.stackexchange.com/questions/30654/in-google-drive-how-can-you-link-directly-to-download-a-zip-file-and-not-view

For the moment, you have to build the URL manually.


https://docs.google.com/uc?export=download&id=YourIndividualID
Where YourIndividualID is the ID of the respective document (zip file). You get it either from the URL or by clicking File → Share and copying it from the sharing URL.
For example, you may get something like this from Google Docs:
https://docs.google.com/file/d/YourIndividualID/edit?usp=sharing
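
A small helper that automates the manual step (the URL pattern is from this post; the function itself is just my illustration):

import re

def direct_download_url(sharing_url):
    # pull YourIndividualID out of a .../file/d/<id>/... sharing URL
    m = re.search(r'/file/d/([^/]+)', sharing_url)
    if not m:
        raise ValueError('could not find the file ID in the URL')
    return 'https://docs.google.com/uc?export=download&id=' + m.group(1)

print(direct_download_url(
    'https://docs.google.com/file/d/YourIndividualID/edit?usp=sharing'))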

Tuesday, April 16, 2013

DNN resources, links, and demos


Matlab based:
http://www.cs.toronto.edu/~hinton/absps/guideTR.pdf


Python based:
Dr. Hinton's website is a good place to start, but his code is based on Matlab, so I started with Theano (Python-based). A useful link:
http://deeplearning.net/

You can also get started with Theano yourself:

Quick Start:


# install Theano from source
git clone git://github.com/Theano/Theano.git
cd Theano
sudo python setup.py develop

# install the numeric dependencies and sanity-check them (Python 2 syntax)
python --version
sudo apt-get install python-numpy python-scipy
python -c "import scipy; print scipy.version.version"

# get the deep learning tutorials and run the RBM example
git clone git://github.com/lisa-lab/DeepLearningTutorials.git
cd DeepLearningTutorials/data
bash download.sh
cd ../code
python rbm.py

How to prepare your data in the format the DNN tutorial code expects:

http://docs.python.org/2/library/pickle.html
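
As a sketch of the layout the tutorial code expects (modeled on mnist.pkl.gz; the shapes below are made up, and the tutorial's Python 2 code uses cPickle): a gzipped pickle of a (train, valid, test) tuple, where each split is a pair of a feature matrix and a label vector.

import gzip, pickle
import numpy as np

train = (np.random.rand(1000, 784).astype('float32'), np.random.randint(0, 10, 1000))
valid = (np.random.rand(200, 784).astype('float32'), np.random.randint(0, 10, 200))
test  = (np.random.rand(200, 784).astype('float32'), np.random.randint(0, 10, 200))

with gzip.open('my_dataset.pkl.gz', 'wb') as f:
    pickle.dump((train, valid, test), f)

# loading it back unpacks the three splits the same way:
with gzip.open('my_dataset.pkl.gz', 'rb') as f:
    train_set, valid_set, test_set = pickle.load(f)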

Thursday, April 11, 2013

about citation



Handling self-citations using Google Scholar


http://cybermetrics.cindoc.csic.es/articles/v13i1p2.html




Does Online Availability Increase Citations? 
Theory and Evidence from a Panel of Economics and Business Journals


http://mccabe.people.si.umich.edu/McCabe_Snyder_Revised_3_2013.pdf

Monday, April 8, 2013

Rover toolkit


0) Abstract:

Rover is a tool that combines the hypothesized word outputs of multiple recognition systems and selects the best-scoring word sequence. Rover is part of the NIST SCTK Scoring Toolkit. A number of different output formats can be generated, and different scoring functions can be specified. A more complete description of the rover system can be found in the paper "A post-processing system to yield reduced word error rates: Recognizer Output Voting Error Reduction (ROVER)".

1) Code:
   ftp://jaguar.ncsl.nist.gov/pub/sctk-2.4.0-20091110-0958.tar.bz2

2) Intro:
   http://www1.icsi.berkeley.edu/Speech/docs/sctk-1.2/rover.htm

3) Ref:
   J. Fiscus, "A post-processing system to yield reduced word error rates: Recognizer output voting error reduction (ROVER)," in Proc. IEEE Workshop on Automatic Speech Recognition and Understanding, Dec. 1997, pp. 347-354.
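
A toy sketch of the voting idea (not the SCTK implementation, which first builds a word transition network by dynamic-programming alignment): given hypotheses that are already aligned word-by-word, pick the most frequent word in each slot.

from collections import Counter

def rover_vote(aligned_hyps):
    # aligned_hyps: equal-length word lists; None marks a deletion
    out = []
    for slot in zip(*aligned_hyps):
        word, _ = Counter(slot).most_common(1)[0]
        if word is not None:
            out.append(word)
    return out

hyps = [['the', 'cat', 'sat'], ['the', 'mat', 'sat'], ['the', 'cat', 'sad']]
print(rover_vote(hyps))  # -> ['the', 'cat', 'sat']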

Making a will, and IEEE Term Life Insurance

source: http://youzinet.com/bbs/viewthread.php?tid=133879
by lepton  

It doesn't seem that long ago that all we thought about every day was how to have fun. But one day, we learned we were going to be parents. My husband decided to find an investment advisor and put our finances in proper order. For the baby on the way, we were forced overnight to become responsible adults.

The advisor suggested we make a will. We downloaded the simplest kind from the Internet, signed it in front of witnesses, and considered the task done.

We have a friend whose father was a doctor and whose family was well off, but when she was ten, her father drowned in a boating accident while fishing. For the next decade and more her mother worked hard as a restaurant waitress to raise the children; by the time we entered the workforce, she was still waiting tables. In tears, my friend told me: make sure your will is done properly and your family is taken care of.

My husband is a very cautious man; he says protecting his wife and children is his first duty. Although I have always had a decent income, he wanted to be sure that if anything happened to him, the family could live without worry even if I did not work. So he bought two kinds of life insurance. The first is called Variable Universal Life Insurance, a form of investment you can eventually draw back out. It is not suitable for every family, though: nothing can be withdrawn for the first seven years, and the costs are relatively high. The advantage is that withdrawals are not taxed.

The second is the most common kind, term life. IEEE's term life insurance is very cheap: two million dollars of coverage for the two of us costs only two hundred-odd dollars a year, a real bargain.

When our first child was a year and a half old, I was pregnant again. This time, with two children and our life insurance exceeding the two-million estate-tax exemption, my husband proposed making a truly complete will. He read a great deal, then dragged a rather reluctant me around town to interview lawyers who specialize in estate planning. We finally chose a woman lawyer, very professional and very approachable.

We met with her several times, laid out all of our assets, did plenty of homework at home, and, arguing over who should be designated for what, aired each other's family's dirty laundry and nearly fought. "We can't let your dad manage the money; he has lost so much in stocks." "Your dad lost money too. At least my dad is better than your mom." With difficulty, we reached consensus on all the arrangements.

This will was detailed and comprehensive. Besides specifying the executor, the guardian, the trustee, who manages the money, how the assets are divided, how much the children receive at which ages, the bonus on college graduation, and so on, it also set up a trust, so the government's heavy estate tax is no longer a fear.

There were some other very important documents as well:
* Statutory Durable Power of Attorney
* Medical Power of Attorney
* Living Will

The day before we were to sign the will, the baby arrived early; the first person we notified from the hospital was our lawyer, and in the end the signing had to wait until after my postpartum month. The signing took a long while; at one point, in front of the lawyer, the witnesses, and everyone else, I was nursing with one hand while signing and swearing oaths with the other, looking quite frazzled. Altogether we spent nearly a thousand dollars. Afterward, holding the heavy folder, I felt I had grown up once again; at the same time I felt a weight lifted, the children's future finally given the best possible arrangement.

When I returned to work after six months of leave, an Indian couple in my group was just starting a long leave, and everyone assumed they had gone back to India. But our boss announced, with great sorrow, that my Indian colleague had leukemia, and the couple had already gone to the cancer hospital in Houston for treatment. The father was only thirty-four, with a five-year-old daughter and a one-year-old son. The news struck us like a thunderbolt. People spontaneously registered as bone-marrow donors and tried every way to help them. In those days, every time we passed his cube we felt terrible. Sadly, although a matching donor was found, his body could no longer tolerate the transplant, and he passed away four months after the diagnosis.

My colleague, the Indian wife, cried her heart out at the funeral. The two children were still so young; they had only gotten their green cards and bought a house the year before, with a bright future ahead. Worse still, pressed by reality, she came back to work just a week after the funeral, forcing herself to carry on. I can hardly imagine how, if something happened to my own husband, I could return one week later to the place where we had once worked side by side every day and do work that no longer held any meaning. Afterward we all looked out for her, of course; I often asked about her children and invited her to yoga. But whenever I saw her in the hallway she still looked so sorrowful and forlorn. I think, had she had a choice, she would absolutely have preferred to start over somewhere else, or at least to give herself some time to grieve first.

Almost a year has passed. A few days ago we met with our lawyer again to amend the will. After the business was done, I mentioned this colleague. To my surprise, it turned out they had once consulted our lawyer too. But after the consultation they never acted, always saying they were too busy, never really getting around to buying insurance or making arrangements. After the husband fell ill, they immediately went back to the lawyer. Sadly, for many things it was already too late.

I lament their misfortune. My white neighbors and friends seem to raise their kids quite casually, yet every one of them made estate arrangements early. By contrast, Chinese parents are usually overprotective of their children, yet unwilling to spend the money and time on this matter that is so vital to family and children. They wait until disaster strikes someone nearby before being shocked into action, just as, after my colleague fell ill, many people at the company rushed to buy life insurance. Alas, who can foresee life's misfortunes? All we can do is provide as well as we can for our family's future. May that thousand dollars turn out to have been spent for nothing, and may my husband and I happily enjoy watching our children grow up together, treasuring each day as a beautiful gift.

Sunday, April 7, 2013

How to increase your paper's citation rate



Citation rate is an important indicator of a paper's quality (and in special cases, an indicator of how bad it is). Of course, if your research is good enough, you need not care much about citation counts even if the paper itself is not well written. For most researchers, though, the number of papers, the tier of the journals, and the citation counts remain deeply worrying matters. Recently a teacher on ScienceNet wrote that "paper citations are not important", which is indeed a good answer; but judging from the situation around us, simply taking citations more seriously would already be real progress, because using citations as the metric is much better than counting papers or tallying impact-factor indices.

There has been much discussion of how to publish more papers and how to publish in higher-impact journals, mainly because the evaluation model of most domestic research institutions treats these as the most important indicators; so far, most pay little attention to how the papers themselves are actually cited, let alone to their true quality. Of the three metrics (paper count, journal tier, and citations), the one that comes closest to faithfully reflecting a paper's quality is its citations. The most important SCI-based indicator used each year to predict the Nobel Prize is the analysis of individuals' citation records, and our national science and technology development program also explicitly raises this issue: in the near future, increasing paper citations is to become an important national goal. Drawing on personal experience, I will discuss a few aspects of this problem, hoping to prompt better ideas from others.

First, raising the level of the research itself is the most important precondition for raising citations.

In other words, getting high-quality papers into high-tier journals is an effective way to increase citations. Authors habitually like to cite papers of higher standing than their own. This is easy to understand: references are cited to justify the soundness and value of one's own research, just as in everyday writing and conversation people like to quote the words of famous figures. Citation naturally follows the same human tendency. Higher-level research is more likely to appear in the higher-tier journals of a field, and when choosing references, people generally follow this pattern: they look up prior work that fits their needs; there may be many candidates, and while some authors cite them all, many select only those generally considered high-tier. How is tier judged? Mostly by the journal's reputation; most scholars carry a rough ranking of the journals in their own field. Ultimately choosing the paper from the high-tier journal is the most natural thing in the world.

Second, take care to choose research topics with citation potential.

In the past, because of the domestic evaluation model, everyone concentrated on publishing in high-tier journals, which produced many "money-burning" papers: chase the hot topics, spend heavily, publish in top journals. Such papers are generally cited reasonably well, since the journals are good after all; but compared with other papers in the same journals, their citations are relatively low, because much of this work is not pioneering, and follow-the-leader papers rarely have high citation value. Choosing a topic with citation potential requires no deep trick; one point suffices: judge whether others will imitate the work, which matters greatly. The method: before starting a project, ask yourself repeatedly whether you would be willing to pursue continued or extended research on it. If even you have no desire to go deeper or extend it to related areas, it is an end-of-the-line study; it may be rigorously designed and tell a complete and vivid story, and publishing it in a good journal may be no problem, but rest assured it will not attract many citations.

Third, the title, abstract, and keywords are all especially critical.

Generally, what is read most is a paper's title, then its abstract. Above all, the title must let readers see the whole paper at a glance; unless it is a review article, never make it so broad that people cannot tell what you actually studied. This is among the most basic requirements of paper writing, yet many people still ignore it. I once had several papers whose content and level were roughly on par, but one of them was cited far more. I gradually discovered why: in searches on related topics, that paper was almost never missed; in other words, its interface to readers was very friendly. Imagine: if your paper has a greater chance of being seen, its chance of being cited, even incidentally, also rises greatly. Indeed I found that some not-so-related articles cited that paper as well.

Finally, attend to a few details.

To publicize yourself, the best way is to regularly write review articles in your own field. Reviews are themselves cited more, and they are an important, legitimate, and sustainable means of academic publicity: a review cites a large number of references, and since many scholars watch closely who cites their papers, these reviews naturally enter their reading lists; through such an article they may notice you and your research papers. You can also choose a simpler route, for example introducing your own research in detail on your blog, telling more potential citing customers about it in fairly plain language. One person or one small group cannot solve all the problems of a field, and people in related areas may not immediately understand your field and its progress. For example, suppose you work on tumors, a rather cutting-edge area internationally, especially the work on cell-signaling molecules; cardiovascular researchers may not follow it at once, yet your findings may offer them some lessons. A popular-science style introduction may then draw in scholars from other fields who could take an interest in your research. Of course there are many means of academic publicity: actively attend your discipline's conferences and earnestly use every opportunity to present your work. Another is to strengthen academic collaboration, ideally with the strongest people in your field. Some say that for most people this is pure fantasy; I say it is not so hard. If you feel your own level is not high, you can repeatedly discuss and exchange ideas with those eminent professors, and at a suitable moment note that your study has received much help from them and ask whether they might join as co-authors. In general, even a famous professor will not refuse authorship on a paper to which he has made some contribution.

Saturday, April 6, 2013

Why hasn't the Thousand Talents Program attracted more overseas scholars?



    Since the Thousand Talents Program began, it has attracted some overseas Chinese scholars back to serve the country, but it is generally regarded as not particularly successful. Whenever overseas Chinese professors get together, whether to return is a perennial topic. Although the program has offered a good platform in recent years, more scholars seem inclined not to return. Why? There are of course many reasons, and different ones for different people. Here I offer one reason, which probably applies mainly to full professors at top American and European universities (the program's stated academic target).

    The point can be summed up by Steve Jobs's question: "Do you want to sell sugar water for the rest of your life or do you want to change the world?" Perhaps "change the world" sounds too grand; another way to put it is "make a difference". If, after returning, one would only do things that could equally be done in America, that is, merely sell sugar water, the attraction is small. If returning meant one could, like the talent flows of the Spring and Autumn and Warring States periods, help build institutions and advance the nation's development, writing history, many would surely be willing. Even doing something meaningful within a small circle would attract quite a few. And for some, a sense of belonging also weighs in, so they would be willing to return even just to sell sugar water (as long as it is not fake sugar water).

    The full professors at top American and European universities that the program hopes to attract not only live comfortably but also work freely. Take me as an example. Although I have had no obligation to work in China since resigning my Changjiang Scholar post, in recent years I have still spent two months or more there each year, doing things I am willing to do. I have not accomplished much, but that itself shows how free a professor in America is. The work does carry pressure, but by full professorship the pressure is mostly self-imposed, driven by the hope of producing results that will last. A tenured professor who truly wants to change his way of life and enjoy it instead of doing research (I know such cases) will face some pressure from the department, but will not lose the professorship or the salary (raises will just come more slowly). On the other hand, with no mandatory retirement age, one who wants to keep doing research can work until the very end. So being a full professor at a top American university means following one's interests or ambitions, free to advance or retreat; it retains real appeal, no worse than the reclusive life of Tao Yuanming that we admire. That is why the program has attracted few such professors and mainly other scholars.

    Many years ago quite a few people probably still believed that returning could accomplish something, at least improve the local environment, so some professors, associate professors, and assistant professors in America accepted salaries lower than an American graduate student's to become Changjiang Scholars. But in 2006 the Changjiang Scholars came under attack and their good names were washed away; never mind improving the domestic research environment, merely protecting oneself and avoiding abuse counted as success. After that, fewer people were willing to return to work, though some still produced results in adversity. When the Thousand Talents Program was first launched, I thought this situation would change, but it now appears it has not. Perhaps many people have sensed that even if returning brings a moment of flowers and applause, getting anything done is too hard; abuse from netizens is unavoidable, and one may have to bow for five pecks of rice or become a casualty of the existing system. Even improving the local environment is hard; sometimes one is changed by the environment instead. Under a domestic system in which nothing ranks above the academicians, even a Thousand Talents recruit whose scholarship exceeds an academician's will have no influence in domestic academia, may even be suppressed by academicians, and can hardly speak of changing the academic environment. As for returnees who hope to become academicians in China, they can only adapt to the domestic system and be changed by the domestic environment, their grand ambitions blown away by wind and rain. In other words, after returning one not only cannot change the world; one may not even manage to sell ordinary sugar water, and can only bow in every direction, say things against one's convictions, or even sell fake sugar water. Is that still worth doing? Judging from some scholars who have already returned, although they have "exhausted the strength to move mountains", far from achieving "great and glorious feats" (to borrow from the Daguanlou long couplet), they often struggle to adapt to the system, earn mixed reputations, or even end in ruin.

    Some days ago I saw an article on a Korean website, "Overseas study decided the fates of Korea, China, and Japan", analyzing the contributions of overseas students to their home countries. It has a touch of "the onlooker sees most clearly", especially the part about China's 120 young students sent to study in America who, after returning, had to be re-educated with the Sacred Edict (I have not verified the factual accuracy) and were powerless to push change in China. Is that not somewhat like today? (North Korea's students fare even worse.) The full text is reproduced below:

Chosun Ilbo (Chinese edition)
Overseas study decided the fates of Korea, China, and Japan

At the beginning of the 20th century, Korea and China fell into colonial and semi-colonial status respectively, while Japan became a powerful empire. What decided the three countries' different fates was the gap in attitude toward, and timing of, the acceptance of Western culture. The decisive turning point was the First Sino-Japanese War (1894-1895), but success and failure had already been determined by 1880.

After the Anglo-French forces entered Beijing in 1860, China recognized the gap in weapons and technology and launched the Self-Strengthening Movement, but only sent students abroad ten years later. In 1872 the Qing government sent 120 teenage "Chinese educational mission" students to America for a planned fifteen years of study, yet in 1881 it abruptly recalled them all, and from 1889 to 1896 sent no students at all. Every Sunday the Chinese students had to gather at the government office and recite the Sacred Edict (Shengyu Guangxun), which embodied the Yongzheng Emperor's Confucian educational thought. They were therefore powerless to drive change in China, and could never shed their status as household servants of the Son of Heaven.

Japan sent its first overseas students during the Tokugawa shogunate; after the Meiji Restoration (1868) their numbers grew further, exceeding 1,000 by 1873. Under the banner of "civilization and enlightenment" they absorbed every kind of Western culture and knowledge, and early students such as Ito Hirobumi and Inoue Kaoru already held key posts by the 1880s, sharing the fruits of their study with the nation.

Eo Yun-jung, a member of the gentlemen's inspection mission that traveled to Japan in 1881 to study modern culture, keenly recognized the need to send students abroad. He said: "Now is the time when the contest is decided by the power of knowledge. If the civil-service examination were abolished, those who seek merit and fame would race one another to study skills and knowledge abroad and bring them home."

In 1883 the Korean court sent more than 100 students to Japan, but after the failure of the Gapsin Coup in 1884, Yuan Shikai, who then controlled Korea, had them all dragged to the execution ground and beheaded.

Today, nearly 130 years later, the ranking of students studying in America has reversed to China, Korea, Japan. Who knows what effect this change will in turn have on the future of the three East Asian countries.

How to Increase Your Paper Citations and h-Index


This post reveals how to increase your paper citations and boost your h-index in five simple steps, courtesy of Nelson Tansu, the youngest professor from Indonesia, who reveals his secret for raising your citation volume rapidly. Tansu has over 200 papers and 2000 citations at the age of 33. He is the only professor qualified to give this advice and to show real evidence. These secret tricks, held so long by Tansu, are a treasure, so remember them well; only now are they revealed to you. The five secret steps to increase your citation index:

(1) Self citation and more self citation

Self citation can more than double your citation index. Cite your own work in every single paper, and cite a lot of it. Many self citations increase your paper's citations; with more than 50% self citation you can double your h-index very fast.
A research paper found that many self citations can increase the number of citations by others!
http://marginalrevolution.com/marginalrevolution/2007/04/does_selfcitati.html
A lot of self citation is the first important key!
Here is evidence of the total self-citation gain you can achieve: with self citations, total citations are 1428 and the h-index is 23; without self citations the h-index is only 13. You can double your citation volume and your h-index! (A fuller citation breakdown: http://whatisgenius.blogspot.com/2011/08/how-to-increase-tansu-h-index.html)
Here are some more examples:
http://whatisgenius.blogspot.com/2011/05/self-consistent-citation-analysis.html
http://whatisgenius.blogspot.com/2011/06/tansu-did-it-again-and-again.html
http://whatisgenius.blogspot.com/2011/06/tansu-did-it-again-novel-self-citations.html
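
To see how much self-citations can move the metric, here is a quick h-index calculator (the citation counts below are made up for illustration):

def h_index(citations):
    # largest h such that h papers have at least h citations each
    ranked = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(ranked, start=1) if c >= i)

total_cites = [40, 35, 30, 22, 18, 12, 9, 7, 4, 2]     # hypothetical per-paper counts
self_stripped = [max(c - 10, 0) for c in total_cites]  # remove 10 self-citations each
print(h_index(total_cites), h_index(self_stripped))    # 7 vs. 5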

(2) Double publication

Write a paper for a conference with many self citations. Then slightly change the title of the paper and publish it in a journal. Remember the self citations. You now have 2 publications, and double the citations.

Example: Tansu wrote this paper for Winter Topicals (WTM), 2011 IEEE Conference, title: "Selective area epitaxy of ultra-high density InGaN based quantum dots"
http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5730033
This paper has 8 self citations.

Then he published the same paper (with a slightly changed title) in Nanoscale Research Letters 2011, http://www.nanoscalereslett.com/content/6/1/342
He just needed to slightly change the title to "Selective area epitaxy of ultra-high density InGaN quantum dots by diblock copolymer lithography". This paper has 19 self citations; the content of the paper remains the same.
So now you have 2 publications and have also doubled your citations.


(3) Rapid self citation

When writing a new paper, cite as many of your own papers as possible in the first two sentences, even if they are not relevant. You can achieve 20 self citations in just the first two sentences. See an example in the prestigious Journal of Applied Physics, where you can play this trick: the first paragraph already has 30 references, so many self citations fit into the first two sentences alone. Read the first sentence and look at the references! Is there a particular reason why the first few words need 19 references on light-emitting diodes, and the full sentence 26 references? I tell you now: the secret reason is of course found in the reference list (highlighted in a figure in the original post), because references 1 through 24 are papers written by Tansu; almost all of them have Tansu as an author. Can you now see what self citation means? Self citation is deliberately citing your own work, even when it is not really relevant to the paper, with the intention of increasing citation volume. This is how you increase your citation index: in the first sentence, cite as many of your own papers as possible, even the irrelevant ones. More examples: http://whatisgenius.blogspot.com/2011/05/self-citation-in-applied-physics.html

(4) Go to SPIE conferences

Every year SPIE holds many conferences around the States; attend one of them and submit 6 papers. SPIE does not have strict rules (unlike IEEE) on the number of references. Do as many self citations as you can, even 40 self citations per paper. You can gain as many as 180 self citations from just one conference.
In 2010, Tansu presented 6 papers at SPIE West; let's look at the number of self citations:
1. Novel approaches for high-efficiency InGaN quantum wells light-emitting diodes: Device physics and epitaxy engineering. Tansu, N., Zhao, H., Zhang, J., Liu, G., Li, X.-H., Ee, Y.-K., Song, R., Huang, G.S. 2011. Proceedings of SPIE - The International Society for Optical Engineering 7954, art. no. 795418.
This paper has 40 self references to Tansu.
Wow, can you believe that? 40 self citations in one paper. This must be a world record in self citation. Tansu reported the best threshold in self citation.
2. Gain characteristics of deep UV AlGaN quantum wells lasers. Zhang, J., Zhao, H., Tansu, N. 2011. Proceedings of SPIE - The International Society for Optical Engineering 7953, art. no. 79530H.
This paper has 27 self references to Tansu.
3. Enhancement of light extraction efficiency of InGaN quantum wells light-emitting diodes using TiO2 microsphere arrays. Li, X.-H., Ee, Y.-K., Song, R., Tansu, N. 2011. Proceedings of SPIE - The International Society for Optical Engineering 7954, art. no. 79540U.
This paper has 27 self references to Tansu.
4. Thermoelectric properties of MOCVD-grown AlInN alloys with various compositions. Zhang, J., Tong, H., Liu, G., Herbsommer, J.A., Huang, G.S., Tansu, N. 2011. Proceedings of SPIE - The International Society for Optical Engineering 7939, art. no. 79390O.
This paper has 27 self references to Tansu.
5. Cathodoluminescence characteristics of linearly shaped staggered InGaN quantum wells light-emitting diodes. Zhao, H., Zhang, J., Liu, G., Toma, T., Poplawsky, J.D., Dierolf, V., Tansu, N. 2011. Proceedings of SPIE - The International Society for Optical Engineering 7939, art. no. 793905.
This paper has 27 self references to Tansu.
6. Analysis of thermoelectric properties of AlInN semiconductor alloys. Zhang, J., Tong, H., Herbsommer, J.A., Tansu, N. 2011. Proceedings of SPIE - The International Society for Optical Engineering 7933, art. no. 79330X.
This paper has 28 self references to Tansu.
Tansu really nailed it that year: 176 self citations from a single SPIE conference. Can you imagine that? 176 self citations from just 6 papers.
This is how to get the most citations in as little time as possible. Tansu definitely gained a large number of self citations. This citation-gain characteristic is so strong that Tansu reported the best threshold in self citation, ever! The world's fastest self-citation record, heading towards 200 self citations per SPIE conference.

(5) Quote more references to cover up your act; what matters is the number of self citations

To make it less obvious that you are self-citing, quote many other references as well. For example, if you have 15 self citations, quote 80 other references in the paper. The self citations look like a small percentage, but you still got all 15. This is a clever trick that only Tansu can teach you.
A paper in Nature also said an easy way to boost a paper's citations is to include many more references:
http://www.nature.com/news/2010/100813/full/news.2010.406.html
But of course they forgot that self citation is the most important part.
One paper was published in Energy Express, a supplement to Optics Express, so Tansu could make an express self citation: it has 16 self citations! But to cover up the dirty act, Tansu cited 92 references. That makes the self citations a small percentage, but it is the absolute number that counts: 16 self citations add 16 citations. Who writes a research paper with 92 references? Other papers in Optics Express have only 10 to 20 references. Why would you need 92? This is clearly a trick created by Tansu to cover up his self citation.
That's all five easy steps to increase your citations. I hope you can use them and double your h-index.

