December 01, 2015
On 11/30/2015 09:13 PM, Andrei Alexandrescu wrote:
> I haven't worked out the math for insertion and deletion yet, but they
> seem to also be around O(sqrt n) or O(log(n) * sqrt(n)).

(Assuming the bucket sizes actually grow linearly.) It seems to me that for insertion O(√n log n) is very easy to achieve, but not for deletion. (For insertion, one simple strategy that satisfies the bound is to repeatedly move the maximum to the next heap [1].)

Deletion leaves a hole in one heap, which should be filled by the minimum element of the next heap, etc. The minimum cannot be extracted efficiently from a max-heap.
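To see why this is the hard direction, here is a minimal illustration (mine, not from the thread): in a max-heap the heap property only forces large elements upward, so a copy of the minimum always sits in a leaf, and the leaves are the last ⌈n/2⌉ array slots. Nothing narrows the search further, so extracting the minimum costs Θ(n), versus O(log n) for the maximum.

```python
def min_of_max_heap(h):
    """Find the minimum of a max-heap stored as a 0-based array.

    Every parent dominates its children, so a copy of the minimum
    must sit in a leaf; the leaves occupy indices n//2 .. n-1.
    Scanning them is unavoidable in general: Theta(n) work.
    """
    n = len(h)
    return min(h[n // 2:])

# A valid max-heap: each h[i] >= its children h[2i+1], h[2i+2].
h = [12, 9, 8, 7, 3, 5, 1]
assert min_of_max_heap(h) == 1
```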



[1]
struct Fancy(T){
  import std.algorithm.mutation : swap;
  T[] a; // concatenated max-heaps of sizes 1, 2, 3, ...
  void insert(T t){
    size_t i=1,j=0; // heap i occupies a[j..j+i]
    for(;j+i<=a.length;j+=i,i++){
      if(!(t<a[j])) continue; // t >= this heap's max: belongs further on
      swap(t,a[j]); // place t at the root, carry the old max forward
      // Sift the new root down (0-based children at 2c+1, 2c+2).
      for(size_t c=0,w=1;w<i;c=w,w=2*c+1){
        if(w+1<i&&a[j+w]<a[j+w+1]) w++; // pick the larger child
        if(!(a[j+c]<a[j+w])) break;
        swap(a[j+c],a[j+w]);
      }
    }
    a~=t; // append to the last, incomplete heap (rooted at j)
    // Sift the appended element up; the parent of offset c is (c-1)/2.
    for(auto c=a.length-1-j;c;c=(c-1)/2){
      if(!(a[j+(c-1)/2]<a[j+c])) break;
      swap(a[j+(c-1)/2],a[j+c]);
    }
  }
}

December 01, 2015
The key search phrase is "cache oblivious data structures". One of the cache-friendly layouts is the van Emde Boas tree.
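For reference, the layout can be sketched as follows (my own illustration, for a complete binary tree stored with 1-based BFS/heap indices; `veb_order` is a made-up helper name). The recursion splits the tree's height in half, lays out the top subtree first, then each bottom subtree left to right, which keeps any root-to-leaf path within O(log_B n) blocks for an unknown block size B:

```python
def veb_order(height):
    """Heap indices (1-based, BFS order) of a complete binary tree,
    listed in van Emde Boas layout order."""
    def layout(root, h):
        if h == 1:
            return [root]
        top_h = h // 2
        bottom_h = h - top_h
        order = layout(root, top_h)
        # Roots of the bottom subtrees are the children of the top
        # subtree's leaves, reached by walking down top_h levels.
        frontier = [root]
        for _ in range(top_h):
            frontier = [c for node in frontier
                          for c in (2 * node, 2 * node + 1)]
        for r in frontier:
            order += layout(r, bottom_h)
        return order
    return layout(1, height)

# Height 3: top subtree {1}, then bottom subtrees rooted at 2 and 3.
assert veb_order(3) == [1, 2, 4, 5, 3, 6, 7]
```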
December 01, 2015
On 11/30/2015 09:57 PM, Andrei Alexandrescu wrote:
> So now consider my square heaps. We have O(n) build time (just a bunch
> of heapifications) and O(sqrt n) search.

How do you build in O(n)? (The initial array is assumed to be completely unordered, afaict.)
December 01, 2015
On 11/30/2015 06:19 PM, Titus Nicolae wrote:
> On Monday, 30 November 2015 at 20:13:15 UTC, Andrei Alexandrescu wrote:
> ...
>> smallest organized in a max heap, ... etc. There are a total of O(sqrt
>> n) heaps, and each has O(sqrt n) elements.
> ...
>
> Hi Andrei,
> If I'm not mistaken, the number of heaps is proportional to n^(1/3) not
> n^(1/2)
> 1+2^2+..+n^2 is proportional to n^3
> https://en.wikipedia.org/wiki/Square_pyramidal_number
> If n heaps have n^3 elements, rewrite n^3=m, then m elements will be
> stored in m^(1/3) heaps
> Interesting data structure! need to look over the details some more
> Titus

Yes, thanks! I just wrote about this on reddit (before having seen this). -- Andrei
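Titus's count is easy to check numerically (a quick sketch, not from the thread): with heap sizes 1², 2², …, k², the first k heaps hold k(k+1)(2k+1)/6 ≈ k³/3 elements, so m elements occupy about (3m)^(1/3) heaps, not m^(1/2).

```python
def heaps_needed(m):
    """Smallest k such that heaps of sizes 1^2, 2^2, ..., k^2
    hold at least m elements."""
    k, total = 0, 0
    while total < m:
        k += 1
        total += k * k
    return k

# The heap count tracks (3m)^(1/3), as the pyramidal-number sum predicts.
for m in (10**3, 10**6, 10**9):
    assert abs(heaps_needed(m) - (3 * m) ** (1 / 3)) < 2
```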

December 01, 2015
On 12/01/2015 03:33 AM, Timon Gehr wrote:
> On 11/30/2015 09:57 PM, Andrei Alexandrescu wrote:
>> So now consider my square heaps. We have O(n) build time (just a bunch
>> of heapifications) and O(sqrt n) search.
>
> How do you build in O(n)? (The initial array is assumed to be completely
> unordered, afaict.)

(I meant to say: There aren't any assumptions on the initial ordering of the array elements.)
December 01, 2015
On 11/30/2015 06:32 PM, Titus Nicolae wrote:
> On Monday, 30 November 2015 at 23:19:44 UTC, Titus Nicolae wrote:
>> On Monday, 30 November 2015 at 20:13:15 UTC, Andrei Alexandrescu wrote:
>> ...
>>> smallest organized in a max heap, ... etc. There are a total of
>>> O(sqrt n) heaps, and each has O(sqrt n) elements.
>> ...
>>
>> Hi Andrei,
>> If I'm not mistaken, the number of heaps is proportional to n^(1/3)
>> not n^(1/2)
>> 1+2^2+..+n^2 is proportional to n^3
>> https://en.wikipedia.org/wiki/Square_pyramidal_number
>> If n heaps have n^3 elements, rewrite n^3=m, then m elements will be
>> stored in m^(1/3) heaps
>> Interesting data structure! need to look over the details some more
>> Titus
>
> This might give a better balanced structure.
>
> Sum of odd numbers 1+3+5+..+(2n-1) = n^2
> For N total elements, there are sqrt(n) heaps, and 2*sqrt(n) elements in
> the largest heap
> @Andrei: La multi ani! :)

This is very interesting. Thanks, and thanks! -- Andrei
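The balanced variant checks out the same way (another quick sketch): with heap sizes 1, 3, 5, …, 2k−1, the first k heaps hold exactly k² elements, so N elements occupy ⌈√N⌉ heaps and the largest heap has about 2√N elements.

```python
import math

def odd_heaps_needed(n):
    """Smallest k with 1 + 3 + ... + (2k-1) = k^2 >= n."""
    k, total = 0, 0
    while total < n:
        k += 1
        total += 2 * k - 1
    return k

# Exactly ceil(sqrt(n)) heaps; the largest heap holds 2k-1 ~ 2*sqrt(n) elements.
for n in (1, 10, 100, 10**6):
    k = odd_heaps_needed(n)
    assert k == math.ceil(math.sqrt(n))
```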

December 01, 2015
On 11/30/15 9:47 PM, Timon Gehr wrote:
> On 12/01/2015 03:33 AM, Timon Gehr wrote:
>> On 11/30/2015 09:57 PM, Andrei Alexandrescu wrote:
>>> So now consider my square heaps. We have O(n) build time (just a bunch
>>> of heapifications) and O(sqrt n) search.
>>
>> How do you build in O(n)? (The initial array is assumed to be completely
>> unordered, afaict.)
>
> (I meant to say: There aren't any assumptions on the initial ordering of
> the array elements.)

That's quite challenging. (My O(n) estimate was off the cuff and possibly wrong.) Creating the structure entails simultaneously solving the selection problem (find the k smallest elements) for several values of k. I'll post here if I come up with something. -- Andrei
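One way to sketch the build (my own sketch, not necessarily the intended method, and only O(n log n) rather than the O(n) under discussion): select the order statistics at the bucket boundaries by recursive multiple selection, partition the array around them, then max-heapify each bucket. The names `build_square_heaps` and `multiselect` below are made up for illustration; `heapq` is a min-heap library, so values are negated to get max-heaps.

```python
import heapq
import random

def build_square_heaps(a):
    """Partition a into buckets of sizes 1, 4, 9, ... with every element
    of bucket i <= every element of bucket i+1, then max-heapify each.
    Boundaries are placed by recursive multiple selection: pick the
    middle boundary, quickselect it into place, recurse on both sides.
    With O(sqrt n) boundaries this costs O(n log n) comparisons."""
    n = len(a)
    bounds, k, total = [], 0, 0
    while total < n:
        k += 1
        total += k * k
        bounds.append(min(total, n))

    def select(lo, hi, pos):
        # Quickselect: afterwards a[lo:pos] <= a[pos] <= a[pos+1:hi].
        while True:
            pivot = a[random.randrange(lo, hi)]
            lt = [x for x in a[lo:hi] if x < pivot]
            eq = [x for x in a[lo:hi] if x == pivot]
            gt = [x for x in a[lo:hi] if x > pivot]
            a[lo:hi] = lt + eq + gt
            if pos < lo + len(lt):
                hi = lo + len(lt)
            elif pos < lo + len(lt) + len(eq):
                return
            else:
                lo += len(lt) + len(eq)

    def multiselect(lo, hi, bs):
        if not bs:
            return
        mid = len(bs) // 2
        b = bs[mid]
        select(lo, hi, b)
        multiselect(lo, b, bs[:mid])
        multiselect(b, hi, bs[mid + 1:])

    multiselect(0, n, bounds[:-1])  # last bound is n itself: no split needed

    heaps, prev = [], 0
    for b in bounds:
        neg = [-x for x in a[prev:b]]
        heapq.heapify(neg)
        heaps.append([-x for x in neg])
        prev = b
    return heaps
```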

December 01, 2015
On Monday, 30 November 2015 at 20:13:15 UTC, Andrei Alexandrescu wrote:
> Okasaki's book is a continued inspiration of data structures and algorithms.
> Start with a simple array of data. Then mentally decompose that array into a concatenation of smaller arrays: first has size 1, second has size 4, third has size 9, then 16, 25, 36, ... Generally the size of these imaginary subarrays grows quadratically. And they're adjacent to each other. The last array may be incomplete.
> ........
> Please share any thoughts!
>
>
> Andrei

Interesting, but isn't that basically a skip list with uneven sub-lists? Insert could be quite complex unless the entire data structure is backed by a contiguous array or a linked list.
December 01, 2015
On Monday, 30 November 2015 at 20:13:15 UTC, Andrei Alexandrescu wrote:
> Okasaki's book is a continued inspiration of data structures and algorithms.
>
> [...]


Hi,

You might find relevant ideas in https://cs.uwaterloo.ca/research/tr/1979/CS-79-31.pdf and http://www.sciencedirect.com/science/article/pii/0022000080900379 .

Cheers,
Marko

December 01, 2015
On 12/1/15 5:19 AM, Marko Mikulicic wrote:
> On Monday, 30 November 2015 at 20:13:15 UTC, Andrei Alexandrescu wrote:
>> Okasaki's book is a continued inspiration of data structures and
>> algorithms.
>>
>> [...]
>
>
> Hi,
>
> You might find relevant ideas in
> https://cs.uwaterloo.ca/research/tr/1979/CS-79-31.pdf and
> http://www.sciencedirect.com/science/article/pii/0022000080900379 .

Noted, thanks! -- Andrei