Backup & Archiving



Among the services offered by Arka Service, we employ various techniques developed to optimise backup procedures. These include optimisations for handling open files and live data sources, as well as compression, encryption and duplicate removal.

BACKUP AND ARCHIVING is the process of copying and archiving data so it may be used to restore the original after a data loss event.
The services offered by Arka Service pursue two separate goals. The first has the primary objective of recovering lost data, whether the loss was caused by deletion or by data corruption. The second has the purpose of recovering data from an earlier point in time, following a conservation criterion defined by the user (retention policy); this is usually configured within backup applications.
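As an illustration, a simple time-based retention policy of the kind mentioned above can be sketched in Python. The function and its parameters (such as `keep_days`) are hypothetical and stand in for whatever a real backup application would expose:

```python
from datetime import date, timedelta

def apply_retention(backup_dates, keep_days=30):
    """Split stored backups into those kept by the policy and those expired.

    backup_dates: iterable of datetime.date objects, one per stored backup.
    keep_days: hypothetical retention window, in days.
    """
    today = date(2024, 1, 31)  # fixed "today" so the example is deterministic
    cutoff = today - timedelta(days=keep_days)
    kept = sorted(d for d in backup_dates if d >= cutoff)
    expired = sorted(d for d in backup_dates if d < cutoff)
    return kept, expired

# Five backups taken 0, 7, 20, 45 and 90 days before "today".
backups = [date(2024, 1, 31) - timedelta(days=n) for n in (0, 7, 20, 45, 90)]
kept, expired = apply_retention(backups)
# Backups within the 30-day window are kept; older ones become eligible for deletion.
```

Real products typically layer further rules on top (for example, keeping one weekly or monthly copy beyond the window), but the core idea is the same cutoff comparison.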

All backup strategies start with the development of concepts based on data repositories, and this should be achieved in steps. Arka Service accompanies clients through the development of these solutions, evaluating their needs based on the nature of the data to be saved. The organisation could be as simple as a sheet of paper listing all backup media (CDs, etc.) and the date on which each was created. A more sophisticated solution could include a computerised index, catalogue or relational database. Of course, different solutions have different advantages.
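A minimal sketch of such a computerised catalogue is shown below. The field names and media labels are hypothetical; a real deployment would more likely use a relational database, but the lookup logic is the same:

```python
# Hypothetical catalogue: one record per backup medium, noting when it was
# created and which directories it contains.
catalogue = [
    {"medium": "DVD-R-001", "created": "2024-01-05", "contents": ["accounting/", "hr/"]},
    {"medium": "DVD-R-002", "created": "2024-02-02", "contents": ["accounting/"]},
]

def find_media_containing(directory):
    """Return the labels of all media whose recorded contents include `directory`."""
    return [entry["medium"] for entry in catalogue if directory in entry["contents"]]

print(find_media_containing("accounting/"))  # both media hold that directory
```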

Below are some of the different repository models used by Arka Service:

  • UNSTRUCTURED (CD-Rs or DVD-Rs with minimal information indicating what is contained and when it was saved)
  • FULL ONLY/SYSTEM IMAGE (contains the complete image of the system)
  • INCREMENTAL (aimed at making archiving more feasible by organising backups according to an incremental approach, saving only data changed since the last backup of any kind. This eliminates the need to save duplicate copies of unchanged data)
  • DIFFERENTIAL (aimed at saving only data that has been modified since the last full backup. This model has the advantage that at most two data sets are required to recover data: the last full backup and the latest differential)
  • REVERSE DELTA (this model starts with a standard full backup. The system then periodically synchronises the full backup with a live copy, all while storing the data necessary to reconstruct earlier versions. This system works particularly well for large data sets that change infrequently)
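The difference between the incremental and differential models above comes down to which reference point the modification check uses. The sketch below illustrates this with hypothetical file names and integer timestamps:

```python
# mtimes: mapping of file name -> last-modified time (integers for simplicity).
mtimes = {"a.txt": 1, "b.txt": 5, "c.txt": 9}

last_full_backup = 3  # time of the last FULL backup
last_any_backup = 7   # time of the most recent backup of ANY kind

# DIFFERENTIAL: select everything modified since the last full backup,
# so each differential set grows until the next full backup.
differential = {f for f, t in mtimes.items() if t > last_full_backup}

# INCREMENTAL: select only files modified since the last backup of any kind,
# so each set is small but restores need the full chain of increments.
incremental = {f for f, t in mtimes.items() if t > last_any_backup}

print(sorted(differential))  # ['b.txt', 'c.txt']
print(sorted(incremental))   # ['c.txt']
```

This is also why the differential model needs at most two data sets to restore, while the incremental model trades that simplicity for less duplicated storage.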