Saturday, August 31, 2013

Snapshots. Easy Backup. Rapid Deployment.

Hi Everyone,

Time to discuss a nice option for cloud backup and rapid deployment. All of us dream about implementation templates: they would bring ERP closer to a boxed product, where you can buy the software, apply a template, and that is it. One step towards this is Snapshots.

Think about snapshots as backups that we can take of the whole system. And because our customizations and reports are stored in the database, we can include them in a snapshot as well.

We have made a few types of snapshots to help partners implement the system quickly by copying a template installation to multiple clients. You can create your own!

Another important thing: snapshots can be saved as zip files and downloaded from the cloud server to your PC. Alternatively, they can be stored inside the database as blobs.
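By the way, a snapshot zip is conceptually just an archive of exported table data, and the very same bytes can sit either in a file on your PC or in a database blob column. Here is a tiny Python sketch of the idea (file names and layout are my own illustration, not Acumatica's actual snapshot format):

```python
import io
import zipfile


def pack_snapshot(tables: dict[str, bytes]) -> bytes:
    """Pack exported table data into a single zip archive, in memory."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in tables.items():
            zf.writestr(f"{name}.xml", data)  # one entry per table
    return buf.getvalue()


def unpack_snapshot(blob: bytes) -> dict[str, bytes]:
    """Read the archive back; works the same for a downloaded file or a DB blob."""
    with zipfile.ZipFile(io.BytesIO(blob)) as zf:
        return {n.removesuffix(".xml"): zf.read(n) for n in zf.namelist()}
```

Same bytes, two storage options: keep the blob in the database, or write it to disk as a downloadable zip.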

Let's create one.
Open System->Management->Companies. On the first tab you will see the existing snapshots sitting in the database.

To create a snapshot, just press the Create Snapshot button on top. A dialog will appear, asking you to give it a Name and choose the snapshot type.
Here we have to decide: do we need it as a backup, with full data in it, or as a template for another customer site installation? For example, after we finish the Proof of Concept stage, it is good to copy all the setups and master files to production.

Take note: transactions can be skipped!
Meaning there is no need to clean up the POC to turn it into a production site!

Then there will be two options: Include Customizations and Prepare Data for Export.
The first one is obvious; the second one will make a zip file and store it inside the system, so you can download it later.

So here is the thing: you can just keep your snapshot within the database, or you can download it and store it on your PC, external storage, etc.

If the zip file is ready to download, the snapshot will show the Prepared check box checked. Otherwise, you can still press the Prepare Snapshot button.

Now if I click Export Snapshot, the file will be downloaded to my PC.
I can upload a file to the server by pressing Import Snapshot.

Then the FINAL stage is restoring the snapshot. You can do it by pressing the Restore Snapshot button; it will place the data from the snapshot file into the database tables.

Here is a small video covering snapshot creation, download, upload, and restoration.

All the best,


Changing Login Picture.

Hi Guys,

Mike inspired me today with a good idea for a blog post.
On the last day of summer, I would like to re-brand the Acumatica Cloud ERP login picture.
Here is what it looks like now:

You already know that with version 4.1 the login frame auto-refreshes every time.
We have a pool of seven predefined pictures. They are kept here:

At present, the only option is to replace these images with the ones we want:

And here is the result:

The only problem is that every time we upgrade Acumatica, the system will overwrite these pictures, so for the time being we have to copy them over from a backup location. In the future we will make them persist.
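Until then, the copy-back after an upgrade is easy to script. A small Python sketch, assuming hypothetical folder locations (adjust the paths and the file extension to your own site; these names are my illustration only):

```python
import shutil
from pathlib import Path


def restore_login_pictures(backup_dir: str, site_dir: str) -> list[str]:
    """Copy custom login images from a backup folder back into the site,
    overwriting the stock pictures that the upgrade put back."""
    backup, site = Path(backup_dir), Path(site_dir)
    site.mkdir(parents=True, exist_ok=True)
    restored = []
    for img in sorted(backup.glob("*.jpg")):
        shutil.copy2(img, site / img.name)  # preserves timestamps too
        restored.append(img.name)
    return restored
```

Run it once after every upgrade and your branding is back.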

All the best,


Wednesday, August 28, 2013

Linking Sales Order to Purchase. Stock Allocation.

Hi Everyone,

One of the questions from our partners was: why don't we have a topic at Acumatica University on how to tie a Sales Order to a Purchase Order, then break this link, then see the goods allocation in inventory?

The basic business process behind this: when we sell something, we may wish to buy goods for this specific sales order. Then, at some point, we may reconsider and dedicate the goods to another sale.

I am using the demo database to demonstrate it.

The first thing I did was switch off sub-items, so as not to clutter the picture.
(Well, if you would like to know how to switch off sub-items on the demo database, please let me know; it was not done from the UI.)

Step One. Create an Item in Stock Items screen.

Nothing special. Using version 4.1

After the item was created, I made an Inventory receipt for 1,000 pieces.

Step Two. Make a Sales Order.

Here I used the above item, indicating the line should be Ship Complete. I added some quantity and a price. Take note of the Mark for PO check box; we will use it in the next step.

Step Three. Link SO to Existing PO.

To achieve this, check the Mark for PO check box, then press the PO Link button.
We may already have a PO, so we can link to an existing one.

We should select the line on that screen, then Save to link that existing PO to the SO.

Step Three A. Create PO for SO on the fly.

Alternatively, we may wish to create a new PO for that Sales Order. Let's go to Actions->Create a PO. The screen below comes up. Here we need to indicate the Vendor and Location and then click Process to create a PO. Once it is created, the SO will get a link to that PO automatically.

Here is the SO. Take note of the link.

Step Four. Break the link. Allocate to Another PO.

Let's say we have reconsidered and want to reallocate the sales order to another vendor's PO.
This is achieved on the Sales Order. Uncheck Mark for PO on the line and save; it will automatically remove the PO link. Now let's link to another vendor's PO: check Mark for PO again and save.
Then press the PO Link button. Now we can choose another vendor and PO.

Step Five. Verify Figures.

We can use either Inventory Summary to see current inventory positions

Or Allocation Details

Step Six. Receive PO and see what Happened.

Here is the receipt

And the Sales Order can now be shipped. I created a shipment document and then confirmed it. Standard process.

The question is: why would we swap POs for a single SO? It could be to manage sales better. Let's say we expect delivery of the same car parts faster from vendor A than from vendor B, and have a customer RIGHT IN FRONT of us waiting... We may wish to consider reallocating POs... ;)

All the Best,


Sunday, August 25, 2013

Translating Acumatica. Changing Field Labels. Regional Settings.

Hi Everyone,

The task of translating the user interface is quite common in Asia, with its multiple business cultures and spoken languages.

It can easily be done via the translation tool inside Acumatica Cloud ERP. No customization is necessary.
We can use the same approach when just a slight amendment of field names or grid labels is needed.

So the first thing we have to do is define a language.

Open System->Management->System Locales and add a new system locale. It can be a brand new language, or the same English language for a different region whose terminology we would like to change. In my example, English (Singapore) was created.

Once I made it active, the system included the additional language on the login screen.

Then we can change the default Date/Time format, etc. Basically, all the cultural settings we have on our computers we can impose on Acumatica. So users from country A can have, say, a DAY-MONTH-YEAR format while, at the same time, users in country B have YEAR-MONTH-DAY.
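Here is the idea in a couple of lines of Python, just to illustrate per-locale formatting (not Acumatica internals; the locale codes and patterns below are my assumptions for the example):

```python
from datetime import date

# Hypothetical per-locale date patterns, for illustration only
FORMATS = {
    "en-SG": "%d-%m-%Y",  # country A: DAY-MONTH-YEAR
    "en-CA": "%Y-%m-%d",  # country B: YEAR-MONTH-DAY
}


def render(d: date, locale: str) -> str:
    """Format one and the same date according to the user's locale."""
    return d.strftime(FORMATS[locale])
```

Same date, two users, two different renderings; the data underneath never changes.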

Now it is time to do the translation. Please navigate to System->Management->Translation Dictionaries and choose the language we want to modify.

The system displays all the labels in the grid below. For each label it lists all the places where it is used; take a look at the lowest portion of the screen. We can use the filter to narrow the list down to specific labels, or use the quick search field, and then enter the target value (the translated label) in the right column of the grid.

So let's try to translate the Batch Number field that is used on the General Ledger Journal Transactions screen. I would like to change it to "Document Sequence".

I entered Batch Number into the search bar, highlighted the row with the Batch Number label, set the Target value to Document Sequence, then saved.
The line disappears right after saving. That is because we have not selected the Show Localized check box on top.

For the changes to take effect, we have to log in to the system again and choose the other language on the login page. Right after that, the modification is in effect.

If you need to translate all labels in one shot, that is possible too: export the labels into an Excel file, use a professional translator (or just Google :) ) to change the labels, then upload the file back to the system.
Let's export the labels first.

Then save the file locally and translate all the content to, say, Simplified Chinese.

And now we just upload this file back to Acumatica.

Save it and we are done. Re-login.
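Under the hood, the bulk step is very simple: walk the exported rows and fill the Target column from a translation dictionary. A Python sketch of that step (I use CSV purely for illustration; the real export is an Excel file, and the column names here are my assumption):

```python
import csv
import io


def translate_labels(exported_csv: str, dictionary: dict[str, str]) -> str:
    """Fill the Target column for every label found in the dictionary;
    labels without a translation keep their existing Target value."""
    rows = list(csv.DictReader(io.StringIO(exported_csv)))
    for row in rows:
        row["Target"] = dictionary.get(row["Source"], row["Target"])
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["Source", "Target"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()
```

Whether the dictionary comes from a human translator or machine translation, the mechanics are the same.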

All the best,


Wednesday, August 21, 2013

Application Pool Settings. Make Sure It's 64-bit.

Hi Guys,

Just wanted to share the application pool settings we should set so that the web server allocates memory better.

Assuming we use a 64-bit machine to host our IIS server, we may still choose 32-bit or 64-bit mode for the Application Pool.

32-bit mode allows us to debug the process, so we might forget to switch it back to 64-bit mode after debugging is complete and we go to production.

How can we identify that a web site is running in 32-bit mode? Just look at the Task Manager:

As long as you see x32 at the end of the IIS Worker Process name, it runs in 32-bit mode.
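The same check can be done from code: a 32-bit process has 4-byte pointers, which caps its address space at 4 GB no matter how much RAM the machine has. A generic Python sketch (nothing IIS-specific; it inspects whatever process runs it):

```python
import struct


def process_bits() -> int:
    """Bitness of the current process: pointer size in bytes times 8."""
    return struct.calcsize("P") * 8


def address_space_limit_gb(bits: int) -> int:
    """Theoretical address-space ceiling for a process of the given bitness."""
    return 2 ** bits // (1024 ** 3)
```

For a 32-bit process the ceiling is 4 GB, and in practice Windows reserves part of that, which is exactly the 2G/4G wall described below.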

In IIS you can find this option under the Application Pool settings.

What may happen in a production environment running in 32-bit mode: the system may consume all available memory, and due to the 32-bit address space it will not be able to scale above 2 GB/4 GB. And here is what happens next:

To prevent this from happening, we should switch the Application Pool BACK to 64-bit mode:

All the best,


Acumatica 4.1 General Availability Release. Get it NOW!

Hi Everyone,

We have just published the download links for Acumatica 4.1 on our portal.
Please try this amazing release.

Cool features available now:

1. Multiple file upload, with drag and drop.
2. Per-column filters on grids.
3. And YES, the Customer Portal!
4. Inquiries now export to DYNAMIC Excel. So forget about reports :) Create Excel files, pivots, etc.

Try it, it is awesome.

All the best,


P.S. Somehow I realised I spend most of my time sitting at the login page and clicking nice....

Saturday, August 10, 2013

Choosing the right Edition. Number of CPU cores demystified.

Hi Everyone,

Sure you've seen this story.

Prospect: How do I convince my boss (CIO, etc.) that we are buying the right edition of Acumatica Cloud ERP?
Sales guy: Errr, mmmm, well... you should trust me...

Prospect: Ok, just explain why we need 4 CPUs, not 2? We only have 3 companies, we can get 2 CPUs, right? Why do you offer us a 4-CPU license????
Sales guy: Errr, mmmm, well... where the hell is the tech guy...?

So, here is a helping hand to our sales team.

What I did was take one of our big customers and analyze the database transactions, users, and payload.
It is a Departmental Edition with 2 CPUs for the web server, 2 CPUs for the SQL server, and 4 GB of shared RAM, sitting on a VERY FAST Hyper-V virtual machine (shared with SQL) and supplied with SSD hard drives. So we can say these CPUs are ideal CPUs for calculating the capacity limits. :)

If you are not interested in the technical part, just scroll to the big-font area; later, just copy-paste the pictures into your PPTs when needed, especially the skyscrapers one.

The rest of the team can stay ;)

First of all, what the web server CPU does in Acumatica: it processes Business Logic.

If that does not ring a bell: the CPU processes what you have entered into the screen, validates that input, calculates some formulas, and finally sends the data back to your browser or to the SQL database.

An analogy, back to the '60s: the CPU is the engine in your electric typewriter, and if it runs slow, even when you typeveryfast it willnotprocess the data; it will either queue the keystrokes and type after you have finished pressing buttons (a good typewriter) or simply jam the letters :) (a lousy one). Well, of course Acumatica is a good one, it never jams! :)

Back to the test: here is the daily transactional load for the past two years.

I can see clear peaks on some days, up to 50,000 transaction lines per day, while on other days it can be less than 100. So our CPU must be able to withstand these peaks. 50,000 per day could be spread evenly over 8 hours, or entered in just 1 minute :). So this aggregated data looks nice but can't determine what the CPU actually does and how powerful it should be.

Let's take a closer look at transactions per minute, to see the actual peaks.

The data became more realistic: on the day highlighted in red we had peaks when users entered 6,000 to 8,000 transaction lines per minute. Let's take a closer look at that specific day.

And now let's put the first peak under the microscope.

Here we can see that Acumatica processes data at a rate of 5,000 records per minute, placing 5,000+5,000+4,000+1,000 into the queue within 3 minutes. So the actual processing rate is around 15,000 transaction lines per 3 minutes, or about 5,000 transaction lines per minute, when the data is supplied in large chunks.

At the same time, we can see that at 9:48 PM the system processed 8,000 records at peak. At that time the amount of data was smaller, or maybe the CPU was less loaded with other non-transactional tasks.

Anyway, for us this is a very good indication: Acumatica can process 5,000 to 8,000 transaction lines (rows) per minute. For 2 CPUs that is 2,500 to 4,000 transaction lines per CPU per minute. Break it down per second and we strike gold:

One CPU can process 40 to 65 transaction lines per second at max.

Of course, I assume this is a dedicated CPU core.
It can also be virtual, but the server will EAT IT UP FULL, om-nom, at such peak loads. So the virtual becomes real. :)
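Here is the arithmetic spelled out, using the numbers measured above:

```python
def per_cpu_per_second(lines_per_minute: int, cpus: int) -> float:
    """Observed system throughput -> per-core, per-second rate."""
    return lines_per_minute / cpus / 60


# Measured: 5,000-8,000 transaction lines per minute on a 2-CPU web server
low = per_cpu_per_second(5_000, 2)   # roughly 40+ lines per core per second
high = per_cpu_per_second(8_000, 2)  # roughly 65+ lines per core per second
```

Rounding conservatively gives the 40-65 lines-per-second-per-core figure quoted above.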

One example could be a distribution-based company processing a large volume of sales every day. What this means for such a company: if a sales order is around 20 lines each, then:

A single CPU can process 2-3 Sales Orders per second. Max.

Let's now think about the users' perspective: how many users can work in the system?
The answer is: it really DOES NOT MATTER :) The only thing that matters is how many transactions these users create.

But anyway, back to my example: here is the number of NAMED USERS who worked with Acumatica over the company's life. We do not license by those, but I was just interested in how many users were creating such a traffic jam :)

Well, we can see that at the peaks there were 40-45 physical persons in the office :) using Acumatica. Very good. We can clearly see the weekends :). Notice the implementation stage :), then go-live and after go-live, as well as the steady-operations part later. Nice... Looks like a comb...

Another metric we can use: the number of operations, which is logins, screen openings, or reports. I am not talking about operations within a screen, which can be treated as transactions, but, well, errr, mmmm. :)

What I see here is how difficult the 2012 Year Closing process was :) And when the team went on holiday after it ;) Look at it in detail and see how difficult it was. Below are Operations Per Day.

Some days saw 6,000 screen openings and logins PER DAY, by just 30 people. That is 200 screens opened by a SINGLE person per day on average. Think about it...

Conclusion is:

The number of users DOES NOT MATTER AT ALL; the only thing that is important is the NUMBER OF TRANSACTIONS at the PEAK. PER SECOND.

Single CPU can process 40 to 65 TRANSACTION LINES per second.

A single CPU can process 2-3 Sales Orders of 20 lines each per second.

For a distribution-based business, translated to Editions it will be:
Departmental = 4-6 SO per second,
Divisional = 8-12 SO per second,
Enterprise = 16-24 SO per second.
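The per-edition numbers are just the per-core figure scaled by core count. A quick sketch (the web-server core counts per edition are my assumption, inferred from the doubling pattern in the figures above):

```python
# Assumed web-server cores per edition (inferred: each edition doubles the cores)
CORES = {"Departmental": 2, "Divisional": 4, "Enterprise": 8}
SO_PER_CORE_PER_SEC = (2, 3)  # 20-line sales orders, from the measurement above


def so_per_second(edition: str) -> tuple[int, int]:
    """Low and high sales-order throughput estimates for an edition."""
    lo, hi = SO_PER_CORE_PER_SEC
    cores = CORES[edition]
    return (lo * cores, hi * cores)
```

For example, `so_per_second("Enterprise")` reproduces the 16-24 SO per second range above.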

For other industries, just take the peak load, calculate the number of transaction lines per second, and compare it with the figures above.

Well, all this assumes you will not design other CPU-hungry logic in reports or BI.
With more data available for analysis from our customers, I will be able to estimate that impact as well.



Tuesday, August 6, 2013

Multi Company (Branch, Ledger) Consolidation

Hi Everyone,

Let’s explore Multi Company consolidation.
Well, even though it is called Multi Company, in fact we can run it not only between different entities, but also between branches or different ledgers of the same company.

What will happen, when consolidation is run?
1. System will log in to the source entity. Of course, this means we have to provide the user name and password for the consolidation process to do it. The system will remember the credentials; then, when we call up the process, it will log in to the source company and…
2. System will take data from the source company General Ledger, specifically from the Account History table. Or, say, the system will take the Account/Subaccount balances, or the Trial Balance if you like, and…
3. System will place these balances INTO a GL batch in the Destination Company.

Well, that is all. Now we just need to release this batch, and the destination company will be updated with the account/subaccount figures.

Q: So what actually is consolidation? A: Simple copying of balances from place A to place B, by means of creating a batch at the destination.

Q: When do we need consolidation? A: Any time you need to copy balances from company A to some other place B, which could be a reporting ledger within company A, another company, or a completely different entity.

Q: Where is consolidation run? A: It always runs AT THE DESTINATION. In other terms, it is a PULL process: data is PULLED by the destination entity from the passive source entity.
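In code terms, the whole process boils down to something like this toy Python sketch (my illustration of the pull, not the actual Acumatica implementation; account codes are made up):

```python
def consolidate(source_balances: dict[str, float]) -> list[dict]:
    """PULL: read account/subaccount balances from the source ledger and
    turn them into the lines of a GL batch at the destination."""
    batch = [
        {"account": acct, "amount": amount}
        for acct, amount in sorted(source_balances.items())
        if amount != 0  # zero balances add nothing to the destination
    ]
    return batch  # releasing this batch updates the destination ledger
```

Everything else in the setup below (credentials, ledger marking, account matching) exists to tell this pull where to read from and where each balance should land.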

Now that the basics are covered, let's perform a simple consolidation. For the sake of simplicity, I will create a destination reporting ledger within the SAME company and then copy data from the ACTUAL accounting ledger into the CONSOLIDATION reporting ledger.

Please note, we have to follow setup steps exactly in order to run consolidation successfully.


STEP 1. Mark the Source Ledger.

We need to log in to the source company and, on the Ledgers screen in GL, mark the ledger as a consolidation source.

STEP 2. Create Destination Ledger.

I will create the CONSOL reporting ledger in the same company. You can try it in another company or another entity; it will work CROSS entity as well. (That is the beauty of it :))

STEP 3. Configure Consolidation Template.

Using the Consolidation screen, we should configure the consolidation process. Here we need to follow the steps exactly.
A. Login to DESTINATION company
B. Open Consolidation screen
C. Enter Destination LEDGER
D. Give a name to your Consolidation
E. Provide Username to login into SOURCE company
F. Provide Password to login to SOURCE company
G. Press Save button

You should see it like on the screen above.

A. Now press button Synchronize All
B. Click refresh button
C. Click it ONE MORE TIME :)
D. Now press on the magnifying glass at the SOURCE LEDGER.
E. Choose Source ledger. It will be the place where the data are coming from.
F. Choose Period Start and Period To for the data to pull.

We should have something like the above. Now we can save it.

STEP 4. Configure Subaccount on the source.

Because consolidation can be a complicated exercise, where we match multiple segments of the subaccount with each other, we have to configure the matching rule in the source company. In my example I will use the simplest scenario, with a single subaccount segment, leaving the more complex cases to the reader.

In my case I have to enter Consolidation Order = 1 and Number of Characters = 2.

STEP 5. Configure Account matching.

Technically, we may have a DIFFERENT chart of accounts in the source and destination companies. Therefore we should configure at the SOURCE what the destination account will be for each of the source accounts. In my simplest scenario it is one-to-one matching, so all we need to do is provide the SAME account as the original for each account we consolidate.

That is before, and this is AFTER


All I did was export the content into an Excel file, add a Consolidation Account column, then import it back into the screen :). Saved.


STEP 6. Performing data PULL.

Now the final step: let's go to the Import Consolidation Data screen and pull the data from the source into our desired place.

Result would be

In the end we should get batches in GL with all the balances sitting there; all we need to do is release them.

One important thing about consolidation: if our fiscal period is closed, we should allow posting to closed periods on the GL Preferences screen.

All the best,