How long do those requests take?
More than 30sec?
Does the CPU % go up during this period?
I'd suggest using a profiler, such as dotTrace or ANTS, on the server to really see which functions take a long time to execute.
/johan
Hi Johan
Each request takes between 20 and 45 seconds, and this is very frustrating for our editors.
I will try your advice and report my findings.
Best regards
/mats
Hi Johan
I followed the CPU trace and noticed a heavier load on all four CPUs on the web server at the moment I started to go down the file manager hierarchy. I'll send you a mail with screen dumps if you would like to see them.
The more files in a folder, the slower it gets. I think I can see that pattern.
And this also includes dialogs when uploading a new file to that folder.
Hi Mats!
Roughly how many files/directories are there in the filesystem?
Are these versioned files with possibly specific access permissions set?
Would it be possible to get a profiler running on the server?
I think the outcome would really point out what's causing those delays.
/johan
Hi
Global folder: about 7800 files, 7764 folders
Page Files: 5859 files, 5857 folders
We do not use the versioning system, and there are no special permissions set on the files/folders except for some folders.
We will try to run a profiler on the server!
Best regards
Hi
I have to correct myself: we have the versioning system turned on, but we do not use it.
We started a trace on the Network Interface >> Packets Sent/sec counter and it peaks really high (hits the ceiling of the trace and stays there for a long time) when trying to open a folder in the file manager with lots of files.
So, the server sends a lot of packets over the network. Is this due to:
- querying SQL Server?
- sending queries to AD for every folder/file? (We use ActiveDirectoryMembershipProvider)
Best regards
//Mats
Packets sent/sec increases from a normal 0-200 packets/sec up to above 2000 packets sent/sec when opening a certain folder with lots of files.
And the curve stays at its peak the whole time (30-40 seconds) until the web page is loaded.
We have now traced SQL Server from the moment I load the file manager and then click down the folders five levels to a certain folder with many files.
And we found that for every file in a folder, calls are made to stored procedures like the code below. And the calls are made for every parent folder up in the hierarchy.
So for every file in the sub folder on level five (5), 10 calls to stored procedures are made...
An SQL Server trace file from opening the file manager to the moment the folder on the fourth level is rendered is 80 MB in size... There are lots of calls to stored procedures, I'd say...
I thought EPiServer CMS didn't make this many calls to the database? This must be a very big performance issue?
declare @p1 nvarchar(100)
set @p1 = N'021b8279-cbcb-4c86-9952-84aee3bf1ec3'
exec ItemFindByName @Id = @p1 output, @Name = N'IT_och_telefoni'
select @p1
go
exec ItemLoad @Id = N'021b8279-cbcb-4c86-9952-84aee3bf1ec3'
go
exec sp_reset_connection
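The scaling described above can be sketched as a back-of-the-envelope model. This is only my reading of the trace, not confirmed EPiServer internals: each file appears to trigger two procedure calls (ItemFindByName + ItemLoad) per ancestor folder, and the file counts below are hypothetical examples:

```python
# Rough model of the call pattern seen in the SQL trace (an assumption
# drawn from the trace, not EPiServer's actual code): for each file,
# the versioned filesystem seems to call ItemFindByName and ItemLoad
# once per ancestor folder up the hierarchy.

def estimated_proc_calls(files_in_folder: int, folder_depth: int,
                         calls_per_ancestor: int = 2) -> int:
    """Estimate stored-procedure calls made to render one folder listing."""
    return files_in_folder * folder_depth * calls_per_ancestor

# One file in a folder at level 5 gives 10 calls, matching the trace:
print(estimated_proc_calls(1, 5))    # 10
# With e.g. 500 files in that folder (a hypothetical count):
print(estimated_proc_calls(500, 5))  # 5000
```

So the cost grows with both the number of files and the folder depth, which would explain why deep folders with many files are the slowest to open.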
Hi Mats!
Those are the objectstore procedures you see being called. The objectstore is used to store data for the versioned filesystem. The objectstore data access is not cached, so a large number of files/folders will generate _a lot_ of calls to the database.
It's probably generally slightly better to build "deeper" rather than "wider" structures, but if you are not using the versioned store, I'd strongly recommend moving to the native filesystem instead.
/johan
Thanks for the answer.
Is it easy to move to native file system from versioning? Is there a conversion module for that?
//Mats
We have experienced a very slow file manager after upgrading to CMS.
Episerver handles ordinary web pages fast, including fast opening of the menu tree to the left. But when going down the file manager folders, it takes LOTS of time. Every dialog page in the file manager takes time. Opening folders, uploading a file, moving a file etc.
We have a lot of files in our upload directories, but this was no problem in our former version (4.31).