-
barney_
mc_ what's the best way to add another bigcouch to an existing cluster?
-
ruel
barney_: best way is to not do it, i.e. create a new cluster, move all data to it, then kill old cluster.
-
ruel
barney_: the reason is that while it's possible to add new nodes to an existing cluster, documents that were written previously will not re-propagate to the new node; only new documents will get spread over all the nodes, including the new one. So although it should work fine, it puts you in a rather scary position: you may think you have the redundancy offered by the new node in addition to
-
ruel
your old ones, but in reality you only have partial redundancy from it. If enough of the old nodes go down, you will lose data, even though you might still have enough nodes up (including the new node) that you shouldn't.
-
mc_
barney_: ^ this is the way
-
mc_
couch does not currently support rebalancing dbs across nodes
-
mc_
though maybe 3.x might? haven't looked
-
mc_
-
mc_
shit, that's couchbase lololo
-
mc_
ignore
-
barney_
ok thanks guys
-
barney_
mc_ a quick erlang...using foldl, trying to build a list of binaries i'm doing -> [Acc ++ NewElement] - but getting [[[<<"a">>|<<"b">>]<<"c">>]]
-
mc_
Acc is a list that you're wrapping in a new list?
-
barney_
hmm maybe i mean -> Acc ++ [NewElement]
-
mc_
typically you would foldl and return [NewElement | Acc]
-
barney_
oh ok
-
mc_
then lists:reverse/1 the result if you need it in diff order
-
mc_
lists:foldl(fun(El, Acc) -> [El | Acc] end, [], [a, b, c]) would give you [c, b, a]
-
mc_
but lists:reverse/1 is "fast" (it's implemented in C inside the lists module)
-
mc_
so it's better to build the list in reverse order like this (since prepending items to a list is super fast)
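A minimal sketch of the pattern mc_ describes, prepending into the accumulator and reversing once at the end (module and function names here are illustrative):

```erlang
-module(fold_demo).
-export([doubled/1]).

%% Build the result by prepending (O(1) per element), then reverse once.
%% Prepend + one reverse is O(n) overall; appending with ++ inside the
%% fold would be O(n^2) because ++ copies the left-hand list each time.
doubled(List) ->
    Reversed = lists:foldl(fun(El, Acc) -> [El * 2 | Acc] end, [], List),
    lists:reverse(Reversed).
```

e.g. fold_demo:doubled([1, 2, 3]) returns [2, 4, 6].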
-
barney_
ok thanks
-
barney_
Numbers = lists:foldl(fun number_lookup_fold/2, [], Matches),
-
barney_
number_lookup_fold(Result, Acc) -> [kz_json:get_ne_binary_value(<<"value">>, Result) | Acc].
-
barney_
would this work too? Numbers = [kz_json:get_ne_binary_value(<<"value">>, Match) || Match <- Matches,
-
mc_
yes, list comprehensions are the way to go
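For reference, both forms barney_ posted do the same job; this self-contained sketch uses maps:get/2 as a stand-in for kz_json:get_ne_binary_value/2 (module name and data shape are illustrative, not the real Kazoo API):

```erlang
-module(extract_demo).
-export([via_fold/1, via_lc/1]).

%% Stand-in for kz_json:get_ne_binary_value(<<"value">>, Match).
get_value(Match) -> maps:get(<<"value">>, Match).

%% foldl builds the list in reverse, so reverse it to keep input order.
via_fold(Matches) ->
    lists:reverse(
        lists:foldl(fun(M, Acc) -> [get_value(M) | Acc] end, [], Matches)).

%% The equivalent list comprehension, already in input order.
via_lc(Matches) ->
    [get_value(M) || M <- Matches].
```

Both return [<<"a">>, <<"b">>] for [#{<<"value">> => <<"a">>}, #{<<"value">> => <<"b">>}].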
-
barney_
ok great . is it more efficient ?
-
mc_
LCs were faster in older versions; the gap has been narrowing
-
mc_
but as long as the left side of || is small, i prefer them
-
barney_
LCs aren't that comprehendable :-D
-
barney_
makes my head implode.. lol
-
mc_
they're sugar for lists:map/2 basically
-
mc_
but you can do filtering and multiple generators
-
mc_
the hidden trick is if you pattern match in the generator, any elements that don't match will be ignored silently
-
mc_
so [Foo || {ok, Foo} <- [{ok, a}, {ok, b}, {error, c}]] would be [a, b]
-
mc_
but lists:map(fun({ok, Foo}) -> Foo end, [{ok, a}, {ok, b}, {error, c}]) would crash with a function_clause error
-
mc_
which can be useful or cause mayhem if you're not ready for it :)
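A runnable version of mc_'s point about the generator pattern silently filtering (module name is illustrative):

```erlang
-module(lc_demo).
-export([ok_values/1]).

%% The {ok, Foo} pattern in the generator silently skips non-matching
%% elements, so {error, c} is dropped. The lists:map/2 equivalent,
%% lists:map(fun({ok, Foo}) -> Foo end, Results), would instead raise
%% function_clause on {error, c}.
ok_values(Results) ->
    [Foo || {ok, Foo} <- Results].
```

lc_demo:ok_values([{ok, a}, {ok, b}, {error, c}]) returns [a, b].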
-
barney_
yes that does seem to have merit
-
barney_
>> barney_: best way is to not do it, i.e. create a new cluster, move all data to it, then kill old cluster.
-
barney_
what does "move all data to it" involve?
-
barney_
ruel ^^