I would really recommend reconsidering serving an IO stream from the disk directly to the response for every request. You should consider some sort of caching layer in between. The ImageResizer library (not a promotion, I have no relation to that company) could be one option.
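For example, a rough sketch of a small disk-cache layer (the handler name, cache path, and entry lookup are just placeholders, adapt them to your setup):

using System;
using System.IO;
using System.Web;

// Sketch: materialize the bytes to a cache folder once, then let
// TransmitFile stream the cached copy on subsequent requests.
public class CachedFileHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Placeholder key - validate it in real code (path traversal!).
        string cacheKey = context.Request.QueryString["guid"];
        string cachePath = context.Server.MapPath("~/App_Data/FileCache/" + cacheKey);

        if (!File.Exists(cachePath))
        {
            Directory.CreateDirectory(Path.GetDirectoryName(cachePath));
            // Placeholder: load Entry.ItemAttributes.Files.File[0].FileContents here.
            byte[] fileContents = LoadFileContents(context);
            File.WriteAllBytes(cachePath, fileContents);
        }

        context.Response.ContentType = "application/octet-stream";
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.TransmitFile(cachePath);
    }

    private byte[] LoadFileContents(HttpContext context)
    {
        throw new NotImplementedException(); // catalog lookup not shown
    }

    public bool IsReusable { get { return true; } }
}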
Are you able to use the Assets subsystem? The only downside of Assets, as far as I remember, is that most of the work needs to be done in code, and the UI is for exploring only :) It has a pretty extensible provider configuration system with a set of built-in providers, one of which is Amazon S3, if you are targeting a high-scale site.
I've implemented a metafield of type File and would like to provide a link to the uploaded files for a product.
For images (which have their own metafield type, ImageFile) you can easily access the image and get its URL via the following code, for example:
Entry.ItemAttributes.Images.Image[0].Url
But the issue with the File metafield is that it seems to be stored in the database instead of saved to disk. The following code gets me the byte array:
Entry.ItemAttributes.Files.File[0].FileContents, where FileContents is a byte array.
So my question is: what is best practice for handling File metafields? The solution I'm thinking of is to create an aspx page where I stream the file to disk and then Response.Write it. The aspx page will take the EntryID and e.g. a GUID as a unique identifier, so my URL will look like:
http://mypage.com/mypage.aspx?entryid=50&guid=324234
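Roughly what I have in mind for the code-behind (just a sketch, the catalog lookup and MIME type handling are left out):

using System;
using System.Web.UI;

public partial class MyPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Placeholder: load the entry from the entryid/guid query string and
        // return Entry.ItemAttributes.Files.File[0].FileContents.
        byte[] fileContents = LoadFileContents(Request.QueryString["entryid"]);

        Response.Clear();
        Response.ContentType = "application/octet-stream"; // or the real MIME type if stored
        Response.BinaryWrite(fileContents);
        Response.End();
    }

    private byte[] LoadFileContents(string entryId)
    {
        throw new NotImplementedException(); // catalog lookup not shown
    }
}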
Is there a more convenient solution? Should I use generic handlers instead?