All Posts
-
[LeetCode] 1470. Shuffle the Array | Algorithm/LeetCode | 2021. 9. 24. 09:05
Given the array nums consisting of 2n elements in the form [x1,x2,...,xn,y1,y2,...,yn], return the array in the form [x1,y1,x2,y2,...,xn,yn].
Example 1: Input: nums = [2,5,1,3,4,7], n = 3 / Output: [2,3,5,4,1,7] / Explanation: Since x1=2, x2=5, x3=1, y1=3, y2=4, y3=7, the answer is [2,3,5,4,1,7].
Example 2: Input: nums = [1,2,3,4,4,3,2,1], n = 4 / Output: [1,4,2,3,3,2,4,1]
Example 3: Input: nums =..
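The interleaving described above can be sketched in a single pass over the first half of the array. This is a minimal Python sketch (the function name `shuffle` is my own, since the post's full solution is cut off in this excerpt):

```python
def shuffle(nums, n):
    # Interleave the first half [x1..xn] with the second half [y1..yn].
    result = []
    for i in range(n):
        result.append(nums[i])      # xi from the first half
        result.append(nums[n + i])  # yi from the second half
    return result

print(shuffle([2, 5, 1, 3, 4, 7], 3))  # [2, 3, 5, 4, 1, 7]
```

Both halves are visited exactly once, so the sketch runs in O(n) time with O(n) extra space for the output array.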
-
[Leetcode] 1742. Maximum Number of Balls in a Box | Algorithm/LeetCode | 2021. 9. 23. 16:49
You are working in a ball factory where you have n balls numbered from lowLimit up to highLimit inclusive (i.e., n == highLimit - lowLimit + 1), and an infinite number of boxes numbered from 1 to infinity. Your job at this factory is to put each ball in the box with a number equal to the sum of digits of the ball's number. For example, the ball number 321 will be put in the box number 3 + 2 + 1 ..
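Since the excerpt is cut off before the solution, here is a minimal sketch of the digit-sum bucketing it describes (the helper name `count_balls` is mine): each ball goes into the box whose number equals the digit sum of the ball's number, and the answer is the size of the fullest box.

```python
from collections import Counter

def count_balls(low_limit, high_limit):
    # Box number = sum of the decimal digits of the ball number.
    boxes = Counter(
        sum(int(d) for d in str(ball))
        for ball in range(low_limit, high_limit + 1)
    )
    # Answer: how many balls are in the fullest box.
    return max(boxes.values())

print(count_balls(1, 10))  # 2 (balls 1 and 10 both land in box 1)
```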
-
[Loss] Cross Entropy | Implementation/Loss | 2021. 9. 22. 13:07
- Cross Entropy
Entropy is a numerical measure of the information content, or certainty, carried by a probability distribution p. In information theory, cross-entropy is the average number of bits needed to distinguish two probability distributions p and q.

import numpy as np

# Define data
p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]

# Define Entropy
def entropy(p):
    return -sum([p[i] * np.log2(p[i]) for i in range(len(p))])

# Define Cross-Entropy
def cross_entropy(p, q):
    return -sum([p[i] * np.log2(q[i]) for i in range(len(p))])

Information th..
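Evaluating the entropy and cross-entropy definitions above on the sample distributions gives a quick sanity check (values rounded; note that H(p, q) is always at least H(p), and their gap is the KL divergence):

```python
import numpy as np

p = [0.10, 0.40, 0.50]
q = [0.80, 0.15, 0.05]

def entropy(p):
    # Average bits to encode samples drawn from p using an optimal code for p.
    return -sum(p[i] * np.log2(p[i]) for i in range(len(p)))

def cross_entropy(p, q):
    # Average bits to encode samples drawn from p using a code optimized for q.
    return -sum(p[i] * np.log2(q[i]) for i in range(len(p)))

print(round(entropy(p), 3))           # about 1.361
print(round(cross_entropy(p, q), 3))  # about 3.288
```

The large gap (about 1.93 bits) reflects how different q is from p: encoding p-distributed data with a q-optimized code wastes that many bits per symbol on average.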
-
[RNN] Counting the Number of Parameters | Implementation/Text | 2021. 9. 21. 13:55
# SimpleRNN parameter reference
# => output_dim, (time-length or sentence-length, input_dim)
# => Dh, (t, d)
from keras.models import Sequential
from keras.layers import SimpleRNN

model = Sequential()
model.add(SimpleRNN(3, input_shape=(2, 10)))  # same as model.add(SimpleRNN(3, input_length=2, input_dim=10))
model.summary()

_________________________________________________________________
Layer (type) Output ..
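The parameter count that model.summary() reports can be derived by hand: a SimpleRNN with hidden size Dh and input size d has an input-to-hidden weight matrix Wx (d x Dh), a recurrent hidden-to-hidden matrix Wh (Dh x Dh), and a bias vector b (Dh), for a total of Dh * (Dh + d + 1). A quick check without Keras (the helper name `simple_rnn_params` is mine):

```python
def simple_rnn_params(hidden_dim, input_dim):
    # Wx: input -> hidden, Wh: hidden -> hidden (recurrent), b: bias
    wx = input_dim * hidden_dim
    wh = hidden_dim * hidden_dim
    b = hidden_dim
    return wx + wh + b  # = hidden_dim * (hidden_dim + input_dim + 1)

print(simple_rnn_params(3, 10))  # 3 * (3 + 10 + 1) = 42
```

For the model above (Dh = 3, d = 10) this gives 42 trainable parameters, which should match the count in the truncated summary. Note the count does not depend on the time length (2): the same weights are reused at every time step.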