Translated: Deep Learning Quiz (Course 1, Week 2 Quiz)
Introduction
This article is a translation of the quiz assignments from the deeplearning.ai Deep Learning Specialization; quizzes for all five courses will be translated over time.
Translator: 黄海广 (Huang Haiguang)
This installment covers Lesson 1, Week 2:
Lesson1 Neural Networks and Deep Learning
Week 2 Quiz - Neural Network Basics
1. What does a neuron compute?
【 】 A neuron computes an activation function followed by a linear function (z = Wx + b)
【★】 A neuron computes a linear function (z = Wx + b) followed by an activation function
【 】 A neuron computes a function g that scales the input x linearly (Wx + b)
【 】 A neuron computes the mean of all features before applying the output to an activation function
Note: The output of a neuron is a = g(Wx + b), where g is the activation function (sigmoid, tanh, ReLU, …).
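To make the note concrete, here is a minimal NumPy sketch of a single neuron's forward pass; the shapes and the choice of sigmoid for g are illustrative assumptions, not part of the quiz.
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))   # the activation function g

W = np.random.randn(1, 3)   # weights for 3 input features
b = np.random.randn(1, 1)   # bias
x = np.random.randn(3, 1)   # one input example

z = np.dot(W, x) + b   # first the linear function z = Wx + b ...
a = sigmoid(z)         # ... then the activation: a = g(Wx + b)
print(a.shape)         # (1, 1)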
2. Which of these is the “Logistic Loss”?
【★】 L(ŷ, y) = -(y log(ŷ) + (1 - y) log(1 - ŷ))
Note: We are using a cross-entropy loss function.
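As a sketch, this loss can be evaluated in NumPy as follows; averaging over the m examples is an assumption about how the cost is reported, not part of the quiz answer.
import numpy as np

def logistic_loss(y_hat, y):
    # Cross-entropy loss, averaged over the m examples.
    m = y.shape[0]
    return -np.sum(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat)) / m

y = np.array([1.0, 0.0, 1.0])       # true labels
y_hat = np.array([0.9, 0.2, 0.7])   # predicted probabilities
print(logistic_loss(y_hat, y))      # small value: predictions agree with labels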
3. Suppose img is a (32, 32, 3) array, representing a 32x32 image with 3 color channels red, green and blue. How do you reshape this into a column vector?
Answer:
x = img.reshape((32 * 32 * 3, 1))
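A quick check of the reshape (the random image is an illustrative stand-in for real pixel data):
import numpy as np

img = np.random.rand(32, 32, 3)     # a 32x32 RGB image
x = img.reshape((32 * 32 * 3, 1))   # flatten into a single column
print(x.shape)                      # (3072, 1)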
4. Consider the two following random arrays “a” and “b”:
a = np.random.randn(2, 3)  # a.shape = (2, 3)
b = np.random.randn(2, 1)  # b.shape = (2, 1)
c = a + b
What will be the shape of “c”?
Answer:
c.shape = (2, 3)
b (a column vector) is broadcast: its single column is copied 3 times so that it can be added to each column of a. Therefore, c.shape = (2, 3).
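The implicit copy that broadcasting performs can be written out with np.tile, as in this sketch:
import numpy as np

a = np.random.randn(2, 3)
b = np.random.randn(2, 1)
c = a + b                            # broadcasting
c_explicit = a + np.tile(b, (1, 3))  # the copy made explicit
print(c.shape, np.allclose(c, c_explicit))  # (2, 3) True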
5. Consider the two following random arrays “a” and “b”:
a = np.random.randn(4, 3)  # a.shape = (4, 3)
b = np.random.randn(3, 2)  # b.shape = (3, 2)
c = a * b
What will be the shape of “c”?
Answer:
The computation cannot happen because the sizes don’t match. It’s going to be an error!
Note: The “*” operator performs element-wise multiplication, which requires the two operands to have the same (or broadcast-compatible) shapes. (4, 3) and (3, 2) are neither, so this raises an error.
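A sketch that triggers and catches the failure:
import numpy as np

a = np.random.randn(4, 3)
b = np.random.randn(3, 2)
try:
    c = a * b   # (4, 3) and (3, 2) are not broadcast-compatible
except ValueError as e:
    print("Error:", e)   # operands could not be broadcast together ...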
6. Suppose you have n_x input features per example. Recall that X = [x^(1) x^(2) … x^(m)]. What is the dimension of X?
Answer:
(n_x, m)
Note: A simple way to validate this is to reason from the construction: each training example x^(i) is a column vector with n_x entries, and X stacks m of these columns side by side, so X has shape (n_x, m).
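A small sketch of this layout; the sizes n_x = 4 and m = 3 are arbitrary illustration choices:
import numpy as np

n_x, m = 4, 3
examples = [np.random.randn(n_x, 1) for _ in range(m)]  # m column vectors x^(i)
X = np.hstack(examples)   # stack the examples as columns
print(X.shape)            # (4, 3), i.e. (n_x, m)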
7. Recall that np.dot(a, b) performs a matrix multiplication on a and b, whereas a * b performs an element-wise multiplication. Consider the two following random arrays “a” and “b”:
a = np.random.randn(12288, 150) # a.shape = (12288, 150)
b = np.random.randn(150, 45) # b.shape = (150, 45)
c = np.dot(a, b)
What is the shape of c?
Answer:
c.shape = (12288, 45). This is a simple matrix multiplication example: the inner dimensions (150 and 150) match, and the result takes the outer dimensions, (12288, 45).
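For contrast, np.dot succeeds here while element-wise “*” would fail, as this sketch shows:
import numpy as np

a = np.random.randn(12288, 150)
b = np.random.randn(150, 45)
print(np.dot(a, b).shape)   # (12288, 45): matrix multiplication
try:
    a * b                   # element-wise product needs matching shapes
except ValueError as e:
    print("Error:", e)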
8. Consider the following code snippet:
# a.shape = (3,4)
# b.shape = (4,1)
for i in range(3):
    for j in range(4):
        c[i][j] = a[i][j] + b[j]
How do you vectorize this?
Answer:
c = a + b.T
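A sketch verifying that the vectorized expression reproduces the loops:
import numpy as np

a = np.random.randn(3, 4)
b = np.random.randn(4, 1)

c_loop = np.zeros((3, 4))
for i in range(3):
    for j in range(4):
        c_loop[i][j] = a[i][j] + b[j]

c_vec = a + b.T   # b.T has shape (1, 4) and is broadcast over a's 3 rows
print(np.allclose(c_loop, c_vec))  # True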
9. Consider the following code:
a = np.random.randn(3, 3)
b = np.random.randn(3, 1)
c = a * b
What will be c?
Answer:
c.shape = (3, 3)
This invokes broadcasting: b is copied three times to become (3, 3), and “*” is an element-wise product, so c.shape = (3, 3).
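Again, the broadcasted copy can be made explicit in a sketch:
import numpy as np

a = np.random.randn(3, 3)
b = np.random.randn(3, 1)
c = a * b                            # element-wise product with broadcasting
c_explicit = a * np.tile(b, (1, 3))  # the three copies written out
print(c.shape, np.allclose(c, c_explicit))  # (3, 3) True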
10. Consider the following computation graph, in which u = a * b, v = a * c, and w = b + c. What is the output J?
J = u + v - w
  = a * b + a * c - (b + c)
  = a * (b + c) - (b + c)
  = (a - 1) * (b + c)
Answer:
J = (a - 1) * (b + c)
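A quick numeric check of the simplification (the sample values are arbitrary):
a, b, c = 3.0, 2.0, 5.0
u, v, w = a * b, a * c, b + c   # the intermediate nodes of the graph
J = u + v - w
print(J, (a - 1) * (b + c))     # both print 14.0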
總結(jié)
以上是生活随笔為你收集整理的原文翻译:深度学习测试题(L1 W2 测试题)的全部?jī)?nèi)容,希望文章能夠幫你解決所遇到的問(wèn)題。
- 上一篇: Python地图可视化三大秘密武器
- 下一篇: 原文翻译:深度学习测试题(L1 W1 测