text "3.4 draw a version of figure 3.3 where the y-intercept and slope of the third hidden unit have changed as in figure 3.14c. assume that the remaining parameters remain the same. figure 3.14 processing in network with one input, three hidden units, and one outputforproblem3.4. a–c)theinputtoeachhiddenunitisalinearfunctionof the inputs. the first two are the same as in figure 3.3, but the last one differs. problem 3.5 prove that the following property holds for α∈r+: relu[α·z]=α·relu[z]. (3.14) this is known as the non-negative homogeneity property of the relu function. draft: please send errata to udlbookmail@gmail.com.40 3 shallow neural networks problem 3.6 following on from problem 3.5, what happens to the shallow network defined in equations 3.3 and 3.4 when we multiply the parameters θ and θ by a positive constant α 10 11 and divide the slope ϕ by the same parameter α? what happens if α is negative? 1 problem3.7considerfittingthemodelinequation3.1usingaleastsquareslossfunction. does this loss function have a unique minimum? i.e., is there a single “best” set of parameters? problem 3.8 considerreplacingthereluactivationfunctionwith(i)theheavisidestepfunc- tion heaviside[z], (ii) the hyperbolic tangent function tanh[z], and (iii) the rectangular func- tion rect[z], where: 8 ( ><0 z<0 0 z<0 heaviside[z]= rect[z]= 1 0≤z≤1. (3.15) 1 z≥0 >: 0 z>1 redraw a version of figure 3.3 for each of these functions. the original parameters were: ϕ= {ϕ ,ϕ ,ϕ ,ϕ ,θ ,θ ,θ ,θ ,θ ,θ }={−0.23,−1.3,1.3,0.66,−0.2,0.4,−0.9,0.9,1.1,−0.7}. 0 1 2 3 10 11 20 21 30 31 provideaninformaldescriptionofthefamilyoffunctionsthatcanbecreatedbyneuralnetworks with one input, three hidden units, and one output for each activation function. problem 3.9∗ show that the third linear region in figure 3.3 has a slope that is the sum of the slopes of the first and fourth linear regions. problem 3.10 consider a neural network with one input, one output, and three hidden units. the construction in figure 3.3 shows how this creates four linear regions. under what circum- stances could this network produce a function with fewer than four linear regions? problem 3.11∗ how many parameters does the model in figure 3.6 have? problem 3.12 how many parameters does the model in figure 3.7 have? problem3.13whatistheactivationpatternforeachofthesevenregionsinfigure3.8? inother words, which hidden units are active (pass the input) and which are inactive (clip the input) for each region? problem 3.14 write out the equations that define the network in figure 3.11. there should be three equations to compute the three hidden units from the inputs and two equations to compute the outputs from the hidden units. problem3.15∗ whatisthemaximumpossiblenumberof3dlinearregionsthatcanbecreated by the network in figure 3.11? problem 3.16 write out the equations for a network with two inputs, four hidden units, and three outputs. draw this model in the style of figure 3.11. problem 3.17∗ equations 3.11 and 3.12 define a general neural network with d inputs, one i hiddenlayercontainingd hiddenunits,andd outputs. findanexpressionforthenumberof o parameters in the model in terms of d , d, and d . i o problem 3.18∗ show that the maximum number of regions created by a shallow network withd =2-dimensionalinput,d =1-dimensionaloutput,andd=3hiddenunitsisseven,as i o in figure 3.8j. use the result of zaslavsky (1975) that the max"