Solr indexing performance with a large amount of data




I have an index of 10 million documents, and per a new requirement I need to add a field to the Solr schema. My question is: what is the ideal approach? Should I add the field to the schema and re-index the whole data set, do a partial index, or something else?

The data has to be submitted regardless of what you do, so the answer depends on the size of the rest of the fields. If they're small, add the field and reindex everything.
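If you go the full-reindex route, the field is first added to the (managed) schema and the documents are then re-submitted from their original source. A minimal sketch in Python, assuming a local Solr instance, a collection named "mycollection", and a hypothetical new field "category"; the document loader is a placeholder for your real data source:

```python
# A minimal sketch, assuming Solr running at localhost:8983 and a
# collection named "mycollection"; field name and data are placeholders.
import requests

SOLR = "http://localhost:8983/solr/mycollection"

# 1. Add the new field through the Schema API (managed schema).
requests.post(f"{SOLR}/schema", json={
    "add-field": {
        "name": "category",   # hypothetical new field
        "type": "string",
        "stored": True,
        "indexed": True,
    }
}).raise_for_status()

# 2. Re-index by re-submitting the documents from the original source
#    (stand-in generator here), then commit once at the end.
def load_source_documents():
    # placeholder: read from your database / files / message queue
    yield {"id": "doc-1", "title": "Example", "category": "news"}

batch = list(load_source_documents())
requests.post(f"{SOLR}/update?commit=true", json=batch).raise_for_status()
```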

If there is a large field present, it might be more effective to let Solr fetch the existing content internally (i.e. do a partial/atomic update) instead of submitting it over the network again, but that requires all fields to be defined as stored or as docValues.
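With an atomic update, only the document id and the new field travel over the wire; Solr rebuilds the rest of the document from its stored/docValues fields. A minimal sketch under the same assumptions as above (local Solr, collection "mycollection", hypothetical field "category"):

```python
# Atomic (partial) update sketch; requires the other fields to be
# stored or have docValues so Solr can reconstruct the documents.
import requests

SOLR = "http://localhost:8983/solr/mycollection"

# Only the id and the new field are sent; the "set" modifier tells Solr
# to set/replace that field's value on the existing document.
partial_updates = [
    {"id": "doc-1", "category": {"set": "news"}},    # hypothetical values
    {"id": "doc-2", "category": {"set": "sports"}},
]

requests.post(f"{SOLR}/update?commit=true",
              json=partial_updates).raise_for_status()
```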

It's impossible to say exactly; you'll have to run a small test with a small number of documents to see how each approach performs on your dataset.
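A rough way to run such a test is to time both approaches on a small sample before deciding. The sketch below assumes the same collection and field as above; field names and document counts are placeholders, and the absolute numbers will depend on your schema, hardware and document sizes:

```python
# Rough timing sketch: index a small sample with full documents vs.
# atomic updates and compare. Not a rigorous benchmark.
import time
import requests

SOLR = "http://localhost:8983/solr/mycollection"

def time_indexing(docs, label):
    start = time.perf_counter()
    requests.post(f"{SOLR}/update?commit=true", json=docs).raise_for_status()
    print(f"{label}: {time.perf_counter() - start:.1f}s for {len(docs)} docs")

# placeholder sample documents (10k of each)
full_docs = [{"id": f"doc-{i}", "title": f"t{i}", "category": "news"}
             for i in range(10_000)]
partial_docs = [{"id": f"doc-{i}", "category": {"set": "news"}}
                for i in range(10_000)]

time_indexing(full_docs, "full reindex sample")
time_indexing(partial_docs, "atomic update sample")
```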




