Forum Post: Re: Way To Store Really (really)...

  • Thread starter: Thierry Ciot
Status: Not open for further replies.

Thierry Ciot

Guest
First of all, I would say: verify you have a real performance issue before tackling what you want to do :). Assuming there is a real perf issue, what you are trying to do is the typical pattern of reducing a dataset. It would help if you described your use case:

  • What do you intend to use the reduced data set for? Displaying a graph, …
  • What are your performance goals?
  • How long would the reduce operation take? (under 2 mins / tens of mins / hours…)

Anyway, let me suggest a few generic solutions. You can use a trigger (on modification of data) ( documentation.progress.com/.../ ):

1) To reduce your data and store it in one text field (with the limitations you outlined). Honestly, I would be really surprised if you hit the 80k limit. If you do reach it, that may mean you haven't reduced enough :). Again, it depends on your use case, but say you need to display a couple of graphs: you should easily be able to fit into 80k (if not, you may have a usability issue where you simply display too much data for your user in the first place). For details, read documentation.progress.com/.../

2) To reduce your data and create multiple object records in a "Cached Object" list. As an example, for a line graph, you would create one object record per x and y value. That solution won't hit the 80k limit, though I am not sure it will get you the perf you need.

The advantage of this approach is that the trigger runs on every record update, so your reduced data set is always up to date.
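To make option 1 concrete, here is a minimal, platform-independent Python sketch of "reduce, then store in one text field". The 80k figure is taken from the post; the striding reduction, the point counts, and the function name are all illustrative assumptions, not a Rollbase API.

```python
import json

MAX_TEXT_FIELD = 80_000  # assumed text-field limit mentioned in the post (~80k chars)

def reduce_series(points, target=500):
    """Downsample (x, y) points by simple striding so a line graph stays readable.

    Striding is just one illustrative reduction; per-bucket averaging or
    min/max decimation are common alternatives.
    """
    if len(points) <= target:
        return list(points)
    step = len(points) / target
    return [points[int(i * step)] for i in range(target)]

# 100k raw samples reduced to 500 points that easily fit in one text field.
raw = [(i, (i * i) % 97) for i in range(100_000)]
reduced = reduce_series(raw)
payload = json.dumps(reduced)
print(len(reduced), len(payload), len(payload) <= MAX_TEXT_FIELD)
```

If the serialized payload ever approaches the limit, that is usually a sign the reduction target is too high for what a user can actually read on a graph.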
Now, if your reduce operation takes a long time, you may not want to do it in a trigger. As the previous poster suggested, you could instead use documentation.progress.com/.../ but then you won't have the option of running it on every record update.

Finally, if you are doing a big-data kind of thing: we don't have built-in support for map/reduce, but you could easily leverage a map/reduce engine as an external operation and use the available API to store the data back into an RB object (as suggested above) upon completion of the map/reduce operations.

Hope this helps,
Thierry
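For readers unfamiliar with the pattern, the map/reduce idea mentioned above can be sketched in-process in a few lines of Python. This only illustrates the map and reduce phases; the record names and the averaging step are hypothetical, and in practice the heavy lifting would run in an external engine with the compact results stored back afterwards.

```python
from collections import defaultdict

# Illustrative raw records: (key, value) pairs, as a map/reduce job would see them.
records = [("sensorA", 3.0), ("sensorB", 1.0), ("sensorA", 5.0), ("sensorB", 7.0)]

buckets = defaultdict(list)
for key, value in records:          # "map" / shuffle: group values by key
    buckets[key].append(value)

# "reduce": collapse each key's values to one small summary (here, an average)
summary = {key: sum(vals) / len(vals) for key, vals in buckets.items()}
print(summary)  # one compact record per key, ready to store back as objects
```

The point is that the output is a handful of small summary records per key, which is exactly the shape that fits the "Cached Object" storage described earlier.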
