Thursday, 16 February 2012

How to find the Nth highest salary


Query to Retrieve the Nth Maximum Value


Replace Employee with your table name and Salary with your column name, where N is the rank of the salary to be found.
SELECT *
FROM Employee E1
WHERE (N-1) = (
    SELECT COUNT(DISTINCT E2.Salary)
    FROM Employee E2
    WHERE E2.Salary > E1.Salary)

In the above example, the inner query uses a value from the outer query in its filter condition, meaning the inner query cannot be evaluated on its own. This is a correlated subquery: each row of the outer query is taken in turn, and the inner query is run once for that row.
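The same counting logic can be sketched in Java (a hypothetical `nthHighest` helper, not part of the original query): a salary is the Nth highest when exactly N-1 distinct salaries are greater than it.

```java
import java.util.Arrays;
import java.util.List;

public class NthHighest {
    // Mimics the correlated subquery: for each "row" (salary), count the
    // distinct salaries greater than it; a match means it is the Nth highest.
    public static Integer nthHighest(List<Integer> salaries, int n) {
        for (Integer s : salaries) {                  // outer query row
            long greater = salaries.stream()          // inner query for that row
                    .filter(x -> x > s)
                    .distinct()
                    .count();
            if (greater == n - 1) return s;
        }
        return null; // fewer than n distinct salaries
    }

    public static void main(String[] args) {
        List<Integer> salaries = Arrays.asList(5000, 7000, 7000, 9000, 4000);
        System.out.println(nthHighest(salaries, 2)); // 7000 (second highest)
    }
}
```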

IBM Interview Questions


These are the latest interview questions I encountered recently at IBM.

Technical - 1
1. Tell me about yourself. (A common question.)
2. Which technologies have you used so far, and how would you rate yourself on each? (The next questions will be based on this rating, so plan your answer accordingly.)
3. Tell me about your project. Assume I am your customer who has come to you: how would you explain it to me?
4. What roles and responsibilities do you handle in your day-to-day work?
5. Tell me about your project architecture.
6. What testing processes are followed in your project?
7. Have clients faced any problems with features you developed? Can you explain?
8. Scenario: you have two ArrayLists, al1 and al2, as shown below:

al1 (ArrayList)          al2 (ArrayList)
e1 (Employee)            e1 (Employee)
e2                       e2
e3                       e3
e4                       e7
e5                       e6

Employee objects (e1, e2, e3, e4, e5) are added to al1, and employee objects (e1, e2, e3, e7, e6) are added to al2.
Now compare the two lists and delete the common objects (e1, e2, e3) from al1, so that only e4 and e5 remain. How would you write the program?
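One direct answer is `List.removeAll`, which removes every element that is also present in the other list. It relies on `equals()`, so the Employee class must override `equals()` (and, by convention, `hashCode()`). A minimal sketch, with a hypothetical Employee holding only an id:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

public class RemoveCommon {
    // Minimal Employee stand-in; removeAll() depends on equals().
    static class Employee {
        final String id;
        Employee(String id) { this.id = id; }
        @Override public boolean equals(Object o) {
            return o instanceof Employee && ((Employee) o).id.equals(id);
        }
        @Override public int hashCode() { return Objects.hash(id); }
        @Override public String toString() { return id; }
    }

    public static List<Employee> employees(String... ids) {
        List<Employee> list = new ArrayList<>();
        for (String id : ids) list.add(new Employee(id));
        return list;
    }

    public static void main(String[] args) {
        List<Employee> al1 = employees("e1", "e2", "e3", "e4", "e5");
        List<Employee> al2 = employees("e1", "e2", "e3", "e7", "e6");
        al1.removeAll(al2);       // drops every element also present in al2
        System.out.println(al1);  // [e4, e5]
    }
}
```

Without the `equals()` override, two Employee objects built from the same data would compare by reference and nothing would be removed.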
9. Why is the String class called immutable?
10. Can you make a class immutable? If yes, how would you do it?
11. How many Action classes are there?
12. Is there any action class similar to the DispatchAction class?
13. What is the difference between forward() and sendRedirect()?
14. Why did you use Spring in your project? Which parts of Spring are you comfortable with?
15. We have the MVC architecture: Model, View and Controller. We could instead use just one JSP and one servlet (containing all the business logic and database logic), and that direct approach would even improve performance. Why, then, do we need MVC?
16. How would you integrate Struts, Spring and Hibernate?
17. How do you use joins?
18. Write a query to get the 5th highest salary.
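Question 10 lends itself to a short code answer. A sketch of the standard recipe for an immutable class (the class name and fields here are illustrative):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// The usual recipe: declare the class final so it cannot be subclassed,
// make every field private and final, provide no setters, and defensively
// copy any mutable field on the way in and on the way out.
final class ImmutableEmployee {
    private final String name;
    private final List<String> skills;

    public ImmutableEmployee(String name, List<String> skills) {
        this.name = name;
        this.skills = new ArrayList<>(skills);       // copy in
    }

    public String getName() { return name; }

    public List<String> getSkills() {
        return Collections.unmodifiableList(skills); // protect on the way out
    }
}
```

String follows the same pattern, which is why it is described as immutable.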

Technical – 2
1. How do you read data from the console and print it?
2. Do you have any knowledge of streams?
3. What are client-side validations? Which technology have you used for them?
4. How do you use getElementById?
5. What is the difference between Set and List?
6. Explain what an object is, both technically and non-technically. (Tell me in one line.)
7. When would you say something is object oriented?
8. What are the object-oriented concepts?
9. What is polymorphism?
10. What is exception handling? (Tell me in one line.)
11. What is the difference between compile-time exceptions and runtime exceptions?
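Question 5 can be demonstrated in a few lines. A minimal sketch of the core differences:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class SetVsList {
    public static void main(String[] args) {
        // A List is ordered, supports positional access, and allows duplicates.
        List<String> list = new ArrayList<>(Arrays.asList("a", "b", "a"));
        System.out.println(list.size());   // 3 - the duplicate "a" is kept
        System.out.println(list.get(0));   // a - access by index

        // A Set rejects duplicates and offers no positional access.
        Set<String> set = new HashSet<>(list);
        System.out.println(set.size());    // 2 - the duplicate collapses
    }
}
```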

Friday, 10 February 2012

Sorting algorithms


Introduction

I grew up with the bubble sort, in common, I am sure, with many colleagues. Having learnt one sorting algorithm, there seemed little point in learning any others; it was hardly an exciting area of study. Mundane sorting may be, but it is also central to many tasks carried out on computers. Prompted by its inclusion in the AEB 'A' level syllabus, I looked at the process again in detail. The efficiency with which sorting is carried out will often have a significant impact on the overall efficiency of a program. Consequently there has been much research, and it is interesting to see the range of alternative algorithms that have been developed.
It is not always possible to say that one algorithm is better than another, as relative performance can vary depending on the type of data being sorted. In some situations, most of the data are in the correct order, with only a few items needing to be sorted. In other situations the data are completely mixed up in a random order and in others the data will tend to be in reverse order. Different algorithms will perform differently according to the data being sorted. Four common algorithms are the exchange or bubble sort, the selection sort, the insertion sort and the quick sort.
The selection sort is a good one to use with students. It is intuitive and very simple to program. It offers quite good performance, its particular strength being the small number of exchanges needed. For a given number of data items, the selection sort always goes through a set number of comparisons and exchanges, so its performance is predictable.
procedure SelectionSort ( var d: DataArrayType; n: integer )  {n is the number of elements}
for k = 1 to n-1 do
begin
    small = k
    for j = k+1 to n do
        if d[j] < d[small] then small = j
    {swap elements k and small}
    Swap(d, k, small)
end
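The pseudocode above translates directly to Java (0-based arrays; the class and method names are chosen here for illustration):

```java
import java.util.Arrays;

public class SelectionSort {
    // On each pass, find the smallest remaining element and swap it into place.
    public static void sort(int[] d) {
        for (int k = 0; k < d.length - 1; k++) {
            int small = k;
            for (int j = k + 1; j < d.length; j++) {
                if (d[j] < d[small]) small = j;
            }
            int tmp = d[k]; d[k] = d[small]; d[small] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] d = {27, 63, 1, 72, 64, 58, 14, 9};
        sort(d);
        System.out.println(Arrays.toString(d)); // [1, 9, 14, 27, 58, 63, 64, 72]
    }
}
```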

The following are the different sorting algorithms covered:
  1. Exchange (Bubble) Sort
  2. Insertion Sort
  3. Selection Sort
  4. Shell Sort
  5. Merge Sort
  6. Heap Sort
  7. Quick Sort
  8. Quick3 Sort

Comparing the Algorithms

There are two important factors when measuring the performance of a sorting algorithm. The algorithms have to compare the magnitude of different elements and they have to move the different elements around. So counting the number of comparisons and the number of exchanges or moves made by an algorithm offer useful performance measures. When sorting large record structures, the number of exchanges made may be the principal performance criterion, since exchanging two records will involve a lot of work. When sorting a simple array of integers, then the number of comparisons will be more important.
It has been said that the only thing going for the bubble (exchange) sort is its catchy name. The logic of the algorithm is simple to understand and it is fairly easy to program. It can also be programmed to detect when it has finished sorting. The selection sort, by comparison, always goes through the same amount of work regardless of the data and the quick sort performs particularly badly with ordered data. However, in general the bubble sort is a very inefficient algorithm.
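The early-exit behaviour mentioned above can be sketched in Java (names chosen here for illustration): if a full pass makes no swaps, the array is already sorted and the algorithm stops.

```java
import java.util.Arrays;

public class BubbleSort {
    // Bubble sort with the finished-sorting check: the swapped flag lets
    // the algorithm terminate as soon as a pass makes no exchanges.
    public static void sort(int[] a) {
        boolean swapped = true;
        for (int pass = a.length - 1; pass > 0 && swapped; pass--) {
            swapped = false;
            for (int i = 0; i < pass; i++) {
                if (a[i] > a[i + 1]) {
                    int tmp = a[i]; a[i] = a[i + 1]; a[i + 1] = tmp;
                    swapped = true;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] a = {27, 63, 1, 72, 64, 58, 14, 9};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 9, 14, 27, 58, 63, 64, 72]
    }
}
```

On already-sorted input the first pass makes no swaps, so only one pass is performed.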
The insertion sort is a little better and whilst it cannot detect that it has finished sorting, the logic of the algorithm means that it comes to a rapid conclusion when dealing with sorted data.
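That rapid conclusion on sorted data comes from the inner loop: each element already in place is left untouched after a single comparison. A minimal Java sketch (names chosen here for illustration):

```java
import java.util.Arrays;

public class InsertionSort {
    // Each element is shifted left until it meets a smaller one; on sorted
    // data the while loop exits immediately, so the sort runs in linear time.
    public static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] a = {27, 63, 1, 72, 64, 58, 14, 9};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 9, 14, 27, 58, 63, 64, 72]
    }
}
```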
The first three algorithms all offer O(n²) performance; that is, sorting times increase with the square of the number of elements being sorted. That means that if you double the number of elements being sorted, then there will be a four-fold increase in the time taken. Ten times more elements increases the time taken by a factor of 100! This is not a problem with small data sets, but with hundreds or thousands of elements, this becomes very significant. With most large data sets, the quick sort is a vastly superior algorithm (although as you might expect, it is much more complex), as the table below shows.

Random Data Set: Number of comparisons made

Sort \ Elements      50     100     200     300     400     500
Selection Sort     1225    4950   19900   44850   79800  124750
Exchange Sort      1410    5335   20300   45650   79866  126585
Insertion Sort     1391    5399   20473   44449   78779  123715
Quick Sort          399     990    1954    3384    5066    6256
It should be pointed out that the methods above all belong to one family: they are all internal sorting algorithms. This means they can only be used when the entire data structure to be sorted can be held in the computer's main memory. There will be situations where this is not possible, for example when sorting a very large transaction file stored on, say, magnetic tape or disc. Then an external sorting algorithm will be needed.

Quick3 Sort


Discussion

The 3-way partition variation of quick sort has slightly higher overhead compared to the standard 2-way partition version. Both have the same best, typical, and worst case time bounds, but this version is highly adaptive in the very common case of sorting with few unique keys.
When stability is not required, 3-way partition quick sort is the general purpose sorting algorithm of choice.
The 3-way partitioning code shown below is written for clarity rather than optimal performance; it exhibits poor locality and performs more swaps than necessary. A more efficient but more elaborate 3-way partitioning method is given in Quicksort is Optimal by Robert Sedgewick and Jon Bentley.

Algorithm

# choose pivot
swap a[n,rand(1,n)]

# 3-way partition
i = 1, k = 1, p = n
while i < p,
  if a[i] < a[n], swap a[i++,k++]
  else if a[i] == a[n], swap a[i,--p]
  else i++
end
→ invariant: a[p..n] all equal
→ invariant: a[1..k-1] < a[p..n] < a[k..p-1]

# move pivots to center
m = min(p-k,n-p+1)
swap a[k..k+m-1,n-m+1..n]

# recursive sorts
sort a[1..k-1]
sort a[n-p+k+1,n]
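A runnable Java version of 3-way quick sort; note this sketch uses a Dijkstra-style single-pass partition, which is simpler than the pseudocode above (and is not the Bentley-McIlroy method the text cites).

```java
import java.util.Arrays;
import java.util.Random;

public class Quick3Sort {
    private static final Random RND = new Random();

    public static void sort(int[] a) { sort(a, 0, a.length - 1); }

    // Dijkstra-style 3-way partition: afterwards a[lo..lt-1] < pivot,
    // a[lt..gt] == pivot, a[gt+1..hi] > pivot, so the equal-to-pivot band
    // never recurses - the source of the speedup with few unique keys.
    private static void sort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        swap(a, lo, lo + RND.nextInt(hi - lo + 1)); // random pivot choice
        int pivot = a[lo];
        int lt = lo, i = lo + 1, gt = hi;
        while (i <= gt) {
            if (a[i] < pivot)      swap(a, lt++, i++);
            else if (a[i] > pivot) swap(a, i, gt--);
            else                   i++;
        }
        sort(a, lo, lt - 1);
        sort(a, gt + 1, hi);
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        int[] a = {4, 1, 4, 4, 2, 4, 3, 4};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 4, 4, 4, 4]
    }
}
```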

Properties

  • Not stable
  • O(lg(n)) extra space
  • O(n²) time, but typically O(n·lg(n)) time
  • Adaptive: O(n) time when O(1) unique keys

References

Programming Pearls by Jon Bentley. Addison Wesley, 1986.
Quicksort is Optimal by Robert Sedgewick and Jon Bentley, Knuthfest, Stanford University, January, 2002.
Bubble-sort with Hungarian ("Csángó") folk dance YouTube video, created at Sapientia University, Tirgu Mures (Marosvásárhely), Romania.
Select-sort with Gypsy folk dance YouTube video, created at Sapientia University, Tirgu Mures (Marosvásárhely), Romania.
The Beauty of Sorting YouTube video, Dynamic Graphics Project, Computer Systems Research Group, University of Toronto.

Quick Sort



Element      1   2   3   4   5   6   7   8
Data        27  63   1  72  64  58  14   9
1st pass     1   9  63  72  64  58  14  27
2nd pass     1   9  14  27  64  58  72  63
3rd pass     1   9  14  27  58  63  72  64
4th pass     1   9  14  27  58  63  64  72   sorted!
The quick sort takes the last element (9) and places it so that all the numbers in the left sub-list are smaller and all the numbers in the right sub-list are bigger. It then quick sorts the left sub-list ({1}) and the right sub-list ({63, 72, 64, 58, 14, 27}). This is a recursive algorithm, since it is defined in terms of itself. This reduces the complexity of programming it; however, it is the least intuitive of the four algorithms.

Discussion

When carefully implemented, quick sort is robust and has low overhead. When a stable sort is not needed, quick sort is an excellent general-purpose sort -- although the 3-way partitioning version should always be used instead.
The 2-way partitioning code shown below is written for clarity rather than optimal performance; it exhibits poor locality, and, critically, exhibits O(n²) time when there are few unique keys. A more efficient and robust 2-way partitioning method is given in Quicksort is Optimal by Robert Sedgewick and Jon Bentley. The robust partitioning produces balanced recursion when there are many values equal to the pivot, yielding probabilistic guarantees of O(n·lg(n)) time and O(lg(n)) space for all inputs.
With both sub-sorts performed recursively, quick sort requires O(n) extra space for the recursion stack in the worst case when recursion is not balanced. This is exceedingly unlikely to occur, but it can be avoided by sorting the smaller sub-array recursively first; the second sub-array sort is a tail recursive call, which may be done with iteration instead. With this optimization, the algorithm uses O(lg(n)) extra space in the worst case.

Algorithm

# choose pivot
swap a[1,rand(1,n)]

# 2-way partition
k = 1
for i = 2:n, if a[i] < a[1], swap a[++k,i]
swap a[1,k]
→ invariant: a[1..k-1] < a[k] <= a[k+1..n]

# recursive sorts
sort a[1..k-1]
sort a[k+1,n]
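A Java sketch of the 2-way version (names chosen here for illustration), including the stack-bounding optimization from the discussion above: the smaller sub-array is sorted recursively and the larger one by iteration, so the recursion depth stays at O(lg n).

```java
import java.util.Arrays;
import java.util.Random;

public class QuickSort {
    private static final Random RND = new Random();

    public static void sort(int[] a) { sort(a, 0, a.length - 1); }

    private static void sort(int[] a, int lo, int hi) {
        while (lo < hi) {
            swap(a, lo, lo + RND.nextInt(hi - lo + 1)); // random pivot choice
            // 2-way partition around a[lo]
            int k = lo;
            for (int i = lo + 1; i <= hi; i++) {
                if (a[i] < a[lo]) swap(a, ++k, i);
            }
            swap(a, lo, k);               // pivot into its final position
            if (k - lo < hi - k) {        // recurse into the smaller half,
                sort(a, lo, k - 1);       // loop (tail call) on the larger
                lo = k + 1;
            } else {
                sort(a, k + 1, hi);
                hi = k - 1;
            }
        }
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        int[] a = {27, 63, 1, 72, 64, 58, 14, 9};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 9, 14, 27, 58, 63, 64, 72]
    }
}
```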

Properties

  • Not stable
  • O(lg(n)) extra space (see discussion)
  • O(n²) time, but typically O(n·lg(n)) time
  • Not adaptive



Heap Sort


Discussion

Heap sort is simple to implement, performs an O(n·lg(n)) in-place sort, but is not stable.
The first loop, the Θ(n) "heapify" phase, puts the array into heap order. The second loop, the O(n·lg(n)) "sortdown" phase, repeatedly extracts the maximum and restores heap order.
The sink function is written recursively for clarity. Thus, as shown, the code requires Θ(lg(n)) space for the recursive call stack. However, the tail recursion in sink() is easily converted to iteration, which yields the O(1) space bound.
Both phases are slightly adaptive, though not in any particularly useful manner. In the nearly sorted case, the heapify phase destroys the original order. In the reversed case, the heapify phase is as fast as possible since the array starts in heap order, but then the sortdown phase is typical. In the few unique keys case, there is some speedup but not as much as in shell sort or 3-way quicksort.

Algorithm

# heapify
for i = n/2:1, sink(a,i,n)
→ invariant: a[1,n] in heap order

# sortdown
for i = 1:n,
    swap a[1,n-i+1]
    sink(a,1,n-i)
    → invariant: a[n-i+1,n] in final position
end

# sink from i in a[1..n]
function sink(a,i,n):
    # {lc,rc,mc} = {left,right,max} child index
    lc = 2*i
    if lc > n, return # no children
    rc = lc + 1
    mc = (rc > n) ? lc : (a[lc] > a[rc]) ? lc : rc
    if a[i] >= a[mc], return # heap ordered
    swap a[i,mc]
    sink(a,mc,n)
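A runnable Java translation (0-based indexing; names chosen here for illustration). As the discussion suggests, sink is written iteratively, so only O(1) extra space is used.

```java
import java.util.Arrays;

public class HeapSort {
    public static void sort(int[] a) {
        int n = a.length;
        // heapify: sink every internal node, from the last parent upward
        for (int i = n / 2 - 1; i >= 0; i--) sink(a, i, n);
        // sortdown: move the max to the end, shrink the heap, restore order
        for (int end = n - 1; end > 0; end--) {
            swap(a, 0, end);
            sink(a, 0, end);
        }
    }

    // Iterative sink: the loop replaces the tail recursion of the pseudocode.
    private static void sink(int[] a, int i, int n) {
        while (2 * i + 1 < n) {
            int mc = 2 * i + 1;                        // left child
            if (mc + 1 < n && a[mc + 1] > a[mc]) mc++; // right child is larger
            if (a[i] >= a[mc]) return;                 // heap order restored
            swap(a, i, mc);
            i = mc;
        }
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        int[] a = {27, 63, 1, 72, 64, 58, 14, 9};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 9, 14, 27, 58, 63, 64, 72]
    }
}
```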

Properties

  • Not stable
  • O(1) extra space (see discussion)
  • O(n·lg(n)) time
  • Not really adaptive


Merge Sort


Discussion

Merge sort is very predictable. It makes between 0.5*lg(n) and lg(n) comparisons per element, and between lg(n) and 1.5*lg(n) swaps per element. The minima are achieved for already sorted data; the maxima are achieved, on average, for random data. If using Θ(n) extra space is of no concern, then merge sort is an excellent choice: It is simple to implement, and it is the only stable O(n·lg(n)) sorting algorithm. Note that when sorting linked lists, merge sort requires only Θ(lg(n)) extra space (for recursion).
Merge sort is the algorithm of choice for a variety of situations: when stability is required, when sorting linked lists, and when random access is much more expensive than sequential access (for example, external sorting on tape).
There do exist linear time in-place merge algorithms for the last step of the algorithm, but they are both expensive and complex. The complexity is justified for applications such as external sorting when Θ(n) extra space is not available.

Algorithm

# split in half
m = n / 2

# recursive sorts
sort a[1..m]
sort a[m+1..n]

# merge sorted sub-arrays using temp array
b = copy of a[1..m]
i = 1, j = m+1, k = 1
while i <= m and j <= n,
    a[k++] = (a[j] < b[i]) ? a[j++] : b[i++]
    → invariant: a[1..k] in final position
while i <= m,
    a[k++] = b[i++]
    → invariant: a[1..k] in final position
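The pseudocode above translates to Java as follows (names chosen here for illustration); as in the pseudocode, only the first half is copied into the temporary array, and ties take the left-half element first, which is what makes the sort stable.

```java
import java.util.Arrays;

public class MergeSort {
    public static void sort(int[] a) { sort(a, 0, a.length - 1); }

    private static void sort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int m = (lo + hi) / 2;     // split in half
        sort(a, lo, m);            // recursive sorts
        sort(a, m + 1, hi);

        // merge: copy only the first half into the temp array; the second
        // half is consumed in place, so any leftovers there are already set
        int[] b = Arrays.copyOfRange(a, lo, m + 1);
        int i = 0, j = m + 1, k = lo;
        while (i < b.length && j <= hi) {
            a[k++] = (a[j] < b[i]) ? a[j++] : b[i++]; // ties favour the copy
        }
        while (i < b.length) a[k++] = b[i++];         // leftovers from the copy
    }

    public static void main(String[] args) {
        int[] a = {27, 63, 1, 72, 64, 58, 14, 9};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 9, 14, 27, 58, 63, 64, 72]
    }
}
```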

Properties

  • Stable
  • Θ(n) extra space for arrays (as shown)
  • Θ(lg(n)) extra space for linked lists
  • Θ(n·lg(n)) time
  • Not adaptive
  • Does not require random access to data
