Backup Ambari Infra Solr

Back up all of your Ambari Infra Solr collections by running the Solr replication API against each collection individually. You must create the backup directory on every Ambari Infra Solr node; however, the backup command backs up only the shards that reside on the node where it runs.

The specified /path/to/backup/directory must first be created on each host where ambari-infra-solr is installed, and it must be writable by the user executing the backup command:
mkdir -p /path/to/backup/directory
chown -R infra-solr:hadoop /path/to/backup/directory
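
The backup commands below assume that $INFRA_SOLR_URL points to an Ambari Infra Solr instance. A minimal sketch of setting it, using a placeholder host name and 8886, the default Ambari Infra Solr port; adjust both to match your cluster:
# Placeholder host name; 8886 is the default Ambari Infra Solr port.
export INFRA_SOLR_URL=infra-solr-host.example.com:8886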

For an unsecured cluster

#Atlas infra-solr collections
curl -v "http://$INFRA_SOLR_URL/solr/vertex_index/replication?command=BACKUP&name=vertex_index_backup&location=/path/to/backup/directory"

curl -v "http://$INFRA_SOLR_URL/solr/edge_index/replication?command=BACKUP&name=edge_index_backup&location=/path/to/backup/directory"

curl -v "http://$INFRA_SOLR_URL/solr/fulltext_index/replication?command=BACKUP&name=fulltext_index_backup&location=/path/to/backup/directory"

#Ranger audit infra-solr collections
curl -v "http://$INFRA_SOLR_URL/solr/ranger_audits/replication?command=BACKUP&name=ranger_audits_backup&location=/path/to/backup/directory"

#Log Search infra-solr collections
curl -v "http://$INFRA_SOLR_URL/solr/hadoop_logs/replication?command=BACKUP&name=hadoop_logs_backup&location=/path/to/backup/directory"

curl -v "http://$INFRA_SOLR_URL/solr/audit_logs/replication?command=BACKUP&name=audit_logs_backup&location=/path/to/backup/directory"

curl -v "http://$INFRA_SOLR_URL/solr/history/replication?command=BACKUP&name=history_backup&location=/path/to/backup/directory"

For a secured cluster

If the cluster is Kerberized, you must first kinit as the Infra Solr service principal.
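
For example, a minimal sketch of the kinit step, assuming the common HDP keytab location and principal naming for the Infra Solr service; verify the actual keytab path and principal on your cluster (for example, with klist -kt) before running it:
# Assumed keytab path and principal format; adjust to match your environment.
kinit -kt /etc/security/keytabs/ambari-infra-solr.service.keytab infra-solr/$(hostname -f)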

#Atlas infra-solr collections
curl -v --negotiate -u: "http://$INFRA_SOLR_URL/solr/vertex_index/replication?command=BACKUP&name=vertex_index_backup&location=/path/to/backup/directory"

curl -v --negotiate -u: "http://$INFRA_SOLR_URL/solr/edge_index/replication?command=BACKUP&name=edge_index_backup&location=/path/to/backup/directory"

curl -v --negotiate -u: "http://$INFRA_SOLR_URL/solr/fulltext_index/replication?command=BACKUP&name=fulltext_index_backup&location=/path/to/backup/directory"

#Ranger audit infra-solr collections
curl -v --negotiate -u: "http://$INFRA_SOLR_URL/solr/ranger_audits/replication?command=BACKUP&name=ranger_audits_backup&location=/path/to/backup/directory"

#Log Search infra-solr collections
curl -v --negotiate -u: "http://$INFRA_SOLR_URL/solr/hadoop_logs/replication?command=BACKUP&name=hadoop_logs_backup&location=/path/to/backup/directory"

curl -v --negotiate -u: "http://$INFRA_SOLR_URL/solr/audit_logs/replication?command=BACKUP&name=audit_logs_backup&location=/path/to/backup/directory"

curl -v --negotiate -u: "http://$INFRA_SOLR_URL/solr/history/replication?command=BACKUP&name=history_backup&location=/path/to/backup/directory"

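Each BACKUP call returns as soon as the request is accepted; it does not wait for the snapshot to finish writing. One way to confirm that a backup completed is to query the same replication handler with command=details and inspect the backup section of the JSON response. This is a minimal sketch for the vertex_index collection; run the matching check for each collection you backed up, and on a Kerberized cluster add --negotiate -u: as in the commands above.
# Check backup status for a collection (vertex_index shown as an example);
# the "backup" section of the response reports the snapshot name and status.
curl -v "http://$INFRA_SOLR_URL/solr/vertex_index/replication?command=details&wt=json"
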
It is highly recommended that you also save the Solr data folder for disaster recovery.

  1. Stop the Ambari Infra service in the Ambari UI.
  2. Back up the service data so that it can be restored later. For Ambari Infra, run the following on each node where ambari-infra-solr is installed (a quick verification check follows this list):
    tar -czf ambari-infra-backup.tar.gz /var/lib/ambari-infra-solr/data
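
To confirm that the archive was written successfully (an optional check, not part of the original procedure), list its first few entries:
    # Verify the archive contains the Solr data directory.
    tar -tzf ambari-infra-backup.tar.gz | head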