arguments.txt
This is a complete JSON object containing every input argument:
{
"units": "relu",
"alpha": 0.74,
"lambda": 0.000191,
"n_hid": [100],
"reg": "L2",
"maxnorm_lim": [],
"classify": "softmax",
"dropout": false,
"droplim": [1.0,0.8,1.0],
"epochs": 24,
"mb_size_in": 50,
"norm_mode": "none",
"opt": "adam",
"opt_params": [0.9, 0.999],
"learn_decay": [0.5,4.0],
"dobatch": true,
"do_batch_norm": true,
"sparse": false,
"initializer": "xavier",
"quiet": true,
"shuffle": false,
"plots": ["Train", "Learning", "Test"],
"plot_now": true
}
Note: plot_now is passed separately in the run_training function signature.
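If the arguments are kept in a JSON file, they can be read back into a Dict with the JSON.jl package. A minimal sketch (the abbreviated argument list and the idea of popping plot_now off before the training call are illustrations, not the package's prescribed workflow):

```julia
using JSON  # the JSON.jl package; add with Pkg.add("JSON")

# Parse the arguments into a Dict{String,Any}. An abbreviated JSON string
# is used here; JSON.parsefile("arguments.json") works the same way on a file.
hp = JSON.parse("""{"units": "relu", "alpha": 0.74, "epochs": 24, "plot_now": true}""")

# plot_now is passed separately to run_training, so remove it from the Dict
plot_now = pop!(hp, "plot_now")   # true
```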
This is the equivalent Julia Dict constructor containing every input argument:
Dict(
"units"=> "relu",
"alpha"=> 0.74,
"lambda"=> 0.000191,
"n_hid"=> [100],
"reg"=> "L2",
"maxnorm_lim"=> [],
"classify"=> "softmax",
"dropout"=> false,
"droplim"=> [1.0,0.8,1.0],
"epochs"=> 24,
"mb_size_in"=> 50,
"norm_mode"=> "none",
"opt"=> "adam",
"opt_params"=> [0.9, 0.999],
"learn_decay"=> [0.5,4.0],
"dobatch"=> true,
"do_batch_norm"=> true,
"sparse"=> false,
"initializer"=> "xavier",
"quiet"=> true,
"shuffle"=> false,
"plots"=> ["Train", "Learning", "Test"],
"plot_now"=> true
)
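Either form produces an ordinary Dict, so individual hyperparameters can be overridden before training. A minimal sketch with an abbreviated copy of the constructor above (the choice of which keys to override is illustrative, and the reading of alpha as a step size is an assumption from context):

```julia
# Abbreviated copy of the full constructor above
hp = Dict(
    "units"  => "relu",
    "alpha"  => 0.74,
    "epochs" => 24,
)

hp["alpha"]  = 0.5   # override alpha (assumed to be the step size)
hp["epochs"] = 50    # train for more epochs
```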