Tuesday 22 December 2015

Am I ready for Google Interviews?

I have been preparing for the Google interview for 4 weeks. I spent most of the time on coding and algorithms, following Cracking the Coding Interview (5th edition). After I finished the book, I wondered, "Am I ready to send my CV and take the interviews?" Following this thought, I have listed the key criteria for the interviews and how well prepared I am for each of them right now.

Criteria 1: Coding (using Java)
1.1: Coding style [DONE]
I found the Google Java style guide here: https://google.github.io/styleguide/javaguide.html
For example:
a) use whitespace correctly;
b) break code into functions to keep it readable.

1.2: Coding speed
There is no good way to speed up my coding in one week, especially when I cannot use an IDE.
So, just keep practising.

1.3: Code readability
I found a website where I can read and comment on other people's code:
http://codereview.stackexchange.com/

1.4: Know Java
This site is great. I spend spare time on it and check whether I already know each topic it covers:
http://www.tutorialspoint.com/java_technology_tutorials.htm

1.5: Know the coding environment
Google Docs
and
CoderPad

Criteria 2: Algorithm

2.1: Searching and sorting algorithms

Simple:
Bubble Sort
Insertion Sort
Selection Sort

Efficient:
Merge Sort (sketch below)
Heap Sort
Quick Sort

Special:
Counting Sort
Radix Sort
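
Of the efficient sorts above, merge sort is the one worth being able to write from memory. A minimal sketch in Java (plain int arrays; my own version, not from the book):
-------------------------------
class MergeSortSketch {
    // Sorts a[lo, hi) in O(n log n) time using O(n) extra space.
    static void mergeSort(int[] a, int lo, int hi) {
        if (hi - lo < 2) return;                 // 0 or 1 element: already sorted
        int mid = (lo + hi) >>> 1;
        mergeSort(a, lo, mid);
        mergeSort(a, mid, hi);
        merge(a, lo, mid, hi);
    }

    static void merge(int[] a, int lo, int mid, int hi) {
        int[] tmp = new int[hi - lo];
        int i = lo, j = mid, k = 0;
        while (i < mid && j < hi) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i < mid) tmp[k++] = a[i++];
        while (j < hi)  tmp[k++] = a[j++];
        System.arraycopy(tmp, 0, a, lo, tmp.length);
    }
}
-------------------------------
Call mergeSort(arr, 0, arr.length); heap sort and quick sort deserve the same write-from-memory practice.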

2.2: Hash tables
Linked List
Initial Capacity
Load factor
Hash function
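
To connect those four points, here is a toy separate-chaining hash map (my own sketch, assuming non-null keys; java.util.HashMap is of course far more refined):
-------------------------------
import java.util.LinkedList;

class SimpleHashMap<K, V> {
    // Array of buckets; each bucket is a linked list of entries.
    static class Entry<K, V> { K key; V value; Entry(K k, V v) { key = k; value = v; } }

    private static final double LOAD_FACTOR = 0.75;     // resize when size/capacity exceeds this
    private LinkedList<Entry<K, V>>[] buckets;
    private int size = 0;

    @SuppressWarnings("unchecked")
    SimpleHashMap() { buckets = new LinkedList[16]; }    // initial capacity = 16

    // Hash function: map a key to a bucket index.
    private int indexFor(K key) { return (key.hashCode() & 0x7fffffff) % buckets.length; }

    public void put(K key, V value) {
        if ((size + 1.0) / buckets.length > LOAD_FACTOR) resize();
        int i = indexFor(key);
        if (buckets[i] == null) buckets[i] = new LinkedList<>();
        for (Entry<K, V> e : buckets[i]) {
            if (e.key.equals(key)) { e.value = value; return; }   // overwrite existing key
        }
        buckets[i].add(new Entry<>(key, value));
        size++;
    }

    public V get(K key) {
        int i = indexFor(key);
        if (buckets[i] == null) return null;
        for (Entry<K, V> e : buckets[i]) if (e.key.equals(key)) return e.value;
        return null;
    }

    @SuppressWarnings("unchecked")
    private void resize() {                              // double the capacity and rehash everything
        LinkedList<Entry<K, V>>[] old = buckets;
        buckets = new LinkedList[old.length * 2];
        size = 0;
        for (LinkedList<Entry<K, V>> bucket : old)
            if (bucket != null) for (Entry<K, V> e : bucket) put(e.key, e.value);
    }
}
-------------------------------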

2.3: Trees
Traversal
InOrderTraversal -> Stack + Pointer p (sketch below)
PreOrderTraversal -> Stack + Right & Left
PostOrderTraversal -> Stack + Stack
LevelTraversal -> ArrayList + ArrayList
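
A minimal sketch of the in-order case above (explicit stack plus a moving pointer p), assuming a simple TreeNode class:
-------------------------------
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

class TreeNode { int val; TreeNode left, right; TreeNode(int v) { val = v; } }

class Traversals {
    // In-order traversal without recursion: stack + pointer p.
    static List<Integer> inOrder(TreeNode root) {
        List<Integer> out = new ArrayList<>();
        Deque<TreeNode> stack = new ArrayDeque<>();
        TreeNode p = root;
        while (p != null || !stack.isEmpty()) {
            while (p != null) { stack.push(p); p = p.left; }  // go as far left as possible
            p = stack.pop();                                  // visit the node
            out.add(p.val);
            p = p.right;                                      // then explore its right subtree
        }
        return out;
    }
}
-------------------------------
Pre-order and post-order follow the same pattern with the stack tricks noted above.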

Rebuild
Pre + In -> Recursive + Map of In (sketch below)
Post + In -> Recursive + Map of In
Pre + Post -> No unique Tree
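
A sketch of the Pre + In rebuild (recursion plus a map of in-order positions), reusing the TreeNode class from the traversal sketch and assuming all values are distinct:
-------------------------------
import java.util.HashMap;
import java.util.Map;

class TreeBuilder {
    // Rebuild a binary tree from its preorder and inorder sequences.
    static TreeNode build(int[] pre, int[] in) {
        Map<Integer, Integer> inIndex = new HashMap<>();   // value -> position in inorder
        for (int i = 0; i < in.length; i++) inIndex.put(in[i], i);
        return build(pre, 0, pre.length - 1, 0, inIndex);
    }

    private static TreeNode build(int[] pre, int preLo, int preHi, int inLo,
                                  Map<Integer, Integer> inIndex) {
        if (preLo > preHi) return null;
        TreeNode root = new TreeNode(pre[preLo]);          // first preorder value is the root
        int rootIn = inIndex.get(pre[preLo]);              // split point in the inorder array
        int leftSize = rootIn - inLo;
        root.left = build(pre, preLo + 1, preLo + leftSize, inLo, inIndex);
        root.right = build(pre, preLo + leftSize + 1, preHi, rootIn + 1, inIndex);
        return root;
    }
}
-------------------------------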

Balanced Tree
Red-Black Tree
R*-Tree & R+-Tree

Graph:
Breadth-First Search (sketch below)
Depth-First Search
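
A minimal BFS sketch on an adjacency list (vertices numbered 0..n-1); DFS has the same shape with a stack or recursion instead of the queue:
-------------------------------
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

class GraphSearch {
    // Breadth-first search from a source vertex; returns the visit order.
    static List<Integer> bfs(List<List<Integer>> adj, int source) {
        boolean[] visited = new boolean[adj.size()];
        Queue<Integer> queue = new ArrayDeque<>();
        List<Integer> order = new ArrayList<>();
        visited[source] = true;
        queue.add(source);
        while (!queue.isEmpty()) {
            int u = queue.poll();
            order.add(u);
            for (int v : adj.get(u)) {
                if (!visited[v]) { visited[v] = true; queue.add(v); }
            }
        }
        return order;
    }
}
-------------------------------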

2.4: Classic computer science problems
Graph Shortest Path: Dijkstra and A* Algorithm
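
A Dijkstra sketch using Java's PriorityQueue (non-negative edge weights assumed); A* is the same loop with a heuristic added to the priority:
-------------------------------
import java.util.Arrays;
import java.util.List;
import java.util.PriorityQueue;

class Dijkstra {
    // Edge of a weighted adjacency list.
    static class Edge { int to; int w; Edge(int to, int w) { this.to = to; this.w = w; } }

    // Single-source shortest paths with a binary heap: O(E log V).
    static int[] shortestPaths(List<List<Edge>> adj, int source) {
        int n = adj.size();
        int[] dist = new int[n];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[source] = 0;
        // Queue of {vertex, tentative distance}, ordered by distance.
        PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> Integer.compare(a[1], b[1]));
        pq.add(new int[]{source, 0});
        while (!pq.isEmpty()) {
            int[] top = pq.poll();
            int u = top[0], d = top[1];
            if (d > dist[u]) continue;                 // stale entry, skip
            for (Edge e : adj.get(u)) {
                if (dist[u] + e.w < dist[e.to]) {      // relax the edge
                    dist[e.to] = dist[u] + e.w;
                    pq.add(new int[]{e.to, dist[e.to]});
                }
            }
        }
        return dist;
    }
}
-------------------------------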

Knapsack problem:
  • 0-1 knapsack problem: dynamic programming over a table v[i][w] (sketch below)
  • bounded knapsack problem: expand each item into its allowed copies and solve as 0-1;
  • unbounded knapsack problem: 
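The 0-1 sketch with the v[i][w] table mentioned above:
-------------------------------
class Knapsack {
    // v[i][w] = best value using the first i items with capacity w.
    static int knapsack01(int[] weight, int[] value, int capacity) {
        int n = weight.length;
        int[][] v = new int[n + 1][capacity + 1];
        for (int i = 1; i <= n; i++) {
            for (int w = 0; w <= capacity; w++) {
                v[i][w] = v[i - 1][w];                         // skip item i
                if (weight[i - 1] <= w) {                      // or take it once
                    v[i][w] = Math.max(v[i][w], v[i - 1][w - weight[i - 1]] + value[i - 1]);
                }
            }
        }
        return v[n][capacity];
    }
}
-------------------------------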
Traveling Salesman Problem:

Minimum Spanning Tree: Prim's algorithm, Kruskal's algorithm
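
A lazy Prim sketch with a PriorityQueue (connected, undirected graph assumed); Kruskal instead sorts all edges and uses union-find:
-------------------------------
import java.util.List;
import java.util.PriorityQueue;

class PrimMST {
    static class Edge { int to; int w; Edge(int to, int w) { this.to = to; this.w = w; } }

    // Returns the total weight of a minimum spanning tree of a connected undirected graph.
    static long mstWeight(List<List<Edge>> adj, int start) {
        int n = adj.size();
        boolean[] inTree = new boolean[n];
        PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> Integer.compare(a[1], b[1]));
        pq.add(new int[]{start, 0});                   // {vertex, weight of the edge reaching it}
        long total = 0;
        int taken = 0;
        while (!pq.isEmpty() && taken < n) {
            int[] top = pq.poll();
            int u = top[0], w = top[1];
            if (inTree[u]) continue;                   // already connected, skip
            inTree[u] = true;
            total += w;
            taken++;
            for (Edge e : adj.get(u)) {
                if (!inTree[e.to]) pq.add(new int[]{e.to, e.w});
            }
        }
        return total;
    }
}
-------------------------------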


2.5: Theory of Computing
Big-O for space and time
Basic:
Master theorem
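
As a reminder (standard textbook statement, written in LaTeX notation), for a divide-and-conquer recurrence

T(n) = a\,T(n/b) + f(n), \qquad a \ge 1,\ b > 1,

the master theorem gives

T(n) = \begin{cases}
\Theta\big(n^{\log_b a}\big) & \text{if } f(n) = O\big(n^{\log_b a - \varepsilon}\big) \text{ for some } \varepsilon > 0,\\
\Theta\big(n^{\log_b a} \log n\big) & \text{if } f(n) = \Theta\big(n^{\log_b a}\big),\\
\Theta\big(f(n)\big) & \text{if } f(n) = \Omega\big(n^{\log_b a + \varepsilon}\big) \text{ and } a\,f(n/b) \le c\,f(n) \text{ for some } c < 1.
\end{cases}

For example, merge sort has T(n) = 2T(n/2) + \Theta(n), which is the second case, so T(n) = \Theta(n \log n).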

Criteria 3: System Design
3.1: System Definition 


3.2: Rough calculation of system requirements


3.3: Communication skills


3.4: Solutions: Data Storage, Data Processing Framework, Hardware, GUI


3.5: Features sets, interfaces, class hierarchies


Criteria 4: Interview Hints
4.1: Talk through your thought processes.

4.2: Ask clarifying questions if you do not understand the problem or need more information.

4.3: Think about ways to improve the solution you'll present.

4.4: Show an interest in Google products.

Criteria 5: Training Interview
5.1: https://leetcode.com/

5.2: Glassdoor



Sunday 18 October 2015

Machine Learning Algorithms (Good Summaries on the Web)

1.
Here is a cheat sheet that shows which algorithms perform best at which tasks.
http://www.lauradhamilton.com/machine-learning-algorithm-cheat-sheet

2. A Tour of Machine Learning Algorithms
http://machinelearningmastery.com/a-tour-of-machine-learning-algorithms/

Different Meetings

1. Team meetings

Prepare the list of updates:

2: Team briefing

3: Negotiation
A: I accept that
B: I reject that
C: Another option is that ....

4: Project Introduction Meeting

5: Brainstorm meeting

6: Meal-time meeting

7: Project Scope Meeting

Tuesday 6 October 2015

Machine Learning: Some Maps

A number of maps that will help me reconstruct my knowledge of Machine Learning ...

1. Classic Machine Learning

2. More detailed

3. Data Mining

4.
Terminology:
BESOM: A Cerebral Cortex Model based on Bayesian Networks
DeSTIN: a scalable deep learning architecture that relies on a combination of unsupervised learning and Bayesian inference
DP: Dynamic programming

DPM: Dirichlet Process Mixture
EM: Expectation–maximization algorithm

HDP: Hierarchical Dirichlet process
HMM: Hidden Markov Model
HPYLM: Hierarchical Pitman-Yor Language Model
HTM: Hierarchical Temporal Memory
iHMM: infinite Hidden Markov Model
MCMC: Markov chain Monte Carlo
NPYLM: Nested Pitman-Yor Language Model
SOINN: Self-Organizing Incremental Neural Network
SOM: Self-Organizing Map


5. Another View


Sunday 9 August 2015

Statistical Distribution Review

1: Why Distribution?
  • Distribution reflects a STRONG PATTERN in the DATA!!
  • In some cases, patterns are more important to study than the data itself.

2: What is Distribution?
  • Cumulative Distribution Function (CDF)
  • Probability Mass Function (PMF) 
  • Probability Density Function (PDF)
--------------------------
Common Distributions and When to USE:
-------------------
3. Normal Distribution
When?
  • If a variable depends on a large number of independent, non-dominating factors, it tends to follow the Normal (Gaussian) distribution, e.g.
  • Measurement noise
  • Human heights
  • Exam grades
What does it look like? [Two-parameter distribution: mean μ and variance σ²]
Standard Normal Distribution: N(0, 1)
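
As a reminder (standard textbook form, in LaTeX notation), the density of N(\mu, \sigma^2) is

f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),

and the Standard Normal above is the special case \mu = 0, \sigma = 1.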

4. Uniform Distribution
When?
  • Each value in X has the same probability.

5. Exponential Distribution (Continuous)
When? 
  • models the waiting time until the next event happens, e.g.
  • the time until the next vehicle passes a point
  • the duration of a bank service
What does it look like?

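As a reminder (standard form, in LaTeX notation), the Exponential(\lambda) density is

f(x) = \lambda e^{-\lambda x}, \qquad x \ge 0,

with mean 1/\lambda; it is the memoryless continuous waiting-time distribution.
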
6. Geometric Distribution (Discrete)
When
  • The number of failures until the first success; the discrete analogue of the Exponential distribution.
What
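As a reminder (standard form, counting failures before the first success, in LaTeX notation), the PMF is

P(X = k) = (1-p)^{k}\, p, \qquad k = 0, 1, 2, \dots

with mean (1-p)/p.
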
7. Binomial distribution (Discrete)
When
  • The number of successes K in N independent trials
  • As N -> infinity, it approaches the Normal distribution

What
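As a reminder (standard form, in LaTeX notation), the Binomial(N, p) PMF is

P(X = K) = \binom{N}{K} p^{K} (1-p)^{N-K}, \qquad K = 0, 1, \dots, N,

with mean Np and variance Np(1-p); for large N it is well approximated by N\big(Np, Np(1-p)\big).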



8. Poisson Distribution
When
  • Counts of random events, e.g.
  • number of phone calls received in one day
What
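As a reminder (standard form, in LaTeX notation), the Poisson(\lambda) PMF is

P(X = k) = \frac{\lambda^{k} e^{-\lambda}}{k!}, \qquad k = 0, 1, 2, \dots

with mean and variance both equal to \lambda.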

9. Chi-2 Distribution

Definition
X1^2 + X2^2 + X3^2 + ... + Xk^2
Given,
Xi ~ N(0, 1)

When
Hypothesis testing, e.g. the chi-squared goodness-of-fit test and the test of independence

What
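As a reminder (standard form, in LaTeX notation): if X_1, \dots, X_k are i.i.d. N(0, 1), then Q = \sum_{i=1}^{k} X_i^2 follows the chi-squared distribution with k degrees of freedom, with density

f(x) = \frac{x^{k/2 - 1} e^{-x/2}}{2^{k/2}\, \Gamma(k/2)}, \qquad x > 0,

mean k and variance 2k.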

10. F-Distribution
When


What

11. T-Distribution

Sunday 12 July 2015

Apple: Notification Program

1. Register an Apple Developer ID
     A$149 per year
2. Build the development environment.
    iOS: 8.2
    Xcode: 6.4
    Objective-C
3. Develop a server-side program that sends the notifications.
example code:
-------------------------------
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Imports for the JavaPNS library used below (package names per JavaPNS 2.x).
import javapns.Push;
import javapns.communication.exceptions.CommunicationException;
import javapns.communication.exceptions.KeystoreException;
import javapns.notification.PushNotificationPayload;
import javapns.notification.PushedNotification;
import org.json.JSONException;

public static ArrayList<String> tokens = new ArrayList<String>();   // device tokens to push to
public static File keystore_file;                                    // the .p12 push certificate
public static String keystore_password = "hongyu";

public static void init() {
    keystore_file = new File("DemoCertificate.p12");
    tokens.add("5af0f958ae7bc0b20ff58789e3605ccec0f24485e119ae4aa939d1f95a6b8798");
}

public static void sendAnAlert(String message)
        throws CommunicationException, KeystoreException, JSONException {
    PushNotificationPayload payload = PushNotificationPayload.complex();
    payload.addAlert(message);                           // the visible alert text
    payload.addCustomDictionary("key1", "Value 1");      // extra data delivered to the app
    // false = use the sandbox (development) APNs environment, not production
    List<PushedNotification> notifications =
            Push.payload(payload, keystore_file, keystore_password, false, tokens);
    printPushedNotifications(notifications);             // helper from the JavaPNS samples that logs each result
}

4. That's all.

Sunday 21 June 2015

Machine Learning Course 14 Study Log (Recommendation System)

Topics with little emphasis in academia but a lot of emphasis in industry:
(1) 
(2) Context-based Recommendation System

(3) 
(4)
(5) 
(6) 
(7)