
How Can Large Sets Of Data Be Efficiently Retrieved?


ahimsavaadhi1


Cassandra .. big data

 

I know my company is using it .. but I have no working knowledge of it .. so beyond that, no knowledge as of now ..

Can someone ask around at the office and let me know... it's very urgent.



I have a large set of files, each around 10 MB to 500 MB, and I would like to store them in a database and retrieve them whenever I need them.

I would like to build a web app to retrieve them, with some features... How can I do that?


 

A relational DB won't work for this.

You need bucket/blob kind of storage.. e.g. Azure Blob Storage.. an Amazon S3 bucket is the same kind of thing, I think, not sure..
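Not this thread's actual setup, just a minimal sketch of the blob-storage idea using Amazon S3 through boto3; the bucket name, keys, and paths here are hypothetical.

```python
import boto3

# Assumes AWS credentials are already configured (env vars or ~/.aws/credentials).
# Bucket name, keys, and file paths below are hypothetical.
s3 = boto3.client("s3")
BUCKET = "my-file-archive"

def upload(local_path, key):
    # boto3 switches to multipart upload automatically for large files,
    # so 10 MB - 500 MB objects need no special handling here.
    s3.upload_file(local_path, BUCKET, key)

def download(key, local_path):
    # Pull a file back down whenever it is needed.
    s3.download_file(BUCKET, key, local_path)

upload("output/batch-001.dat", "2016/01/batch-001.dat")
download("2016/01/batch-001.dat", "/tmp/batch-001.dat")
```

Azure Blob Storage works the same way conceptually: you pay for what you store and transfer, and the store itself handles durability.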


Put all the files in one folder... use search on... My Computer.


That's the reason this thread started, man...

GitLab is closer to what I'm thinking of, but I'm not sure if GitLab deals with such large datasets.

One of my friends works for a gaming company in NY, man... he was telling me they shoot CG video in some format that runs into gigabytes for just 15 seconds or so. Then they have to do some transformation and convert it to an MPEG format... this transformation can't be handled by normal computers... they use Azure Files and AWS for this purpose... like, pay only for what you use, man...

 

In your case you can use MySQL and PHP, if it's purely web-based... more than enough for 500 MB, man...
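The suggestion above is PHP + MySQL; here is the same idea as a minimal sketch in Python instead (Flask standing in for the PHP layer, SQLite standing in for MySQL; the table and paths are hypothetical): keep only metadata in the database and serve the large files from disk.

```python
import sqlite3
from flask import Flask, abort, send_file

app = Flask(__name__)
DB = "files.db"  # hypothetical metadata DB with table: files(name TEXT, path TEXT)

@app.route("/files/<name>")
def get_file(name):
    # Only the name -> path mapping lives in the database;
    # the 10 MB - 500 MB file bodies stay on the filesystem.
    with sqlite3.connect(DB) as conn:
        row = conn.execute(
            "SELECT path FROM files WHERE name = ?", (name,)
        ).fetchone()
    if row is None:
        abort(404)
    return send_file(row[0])

if __name__ == "__main__":
    app.run()
```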




Each file is around 10 MB to 500 MB, and those files are generated very frequently.

My current work churns out almost 40 GB of data per month.

And we mandatorily need to keep it for 10 years, which works out to roughly 4.8 TB over the retention window.

And sometimes we need to retrieve that data... which is cumbersome...

So my plan is to keep at least the latest 3 months of data in a database that can handle it, retrieve it with a web app, and present it to whoever needs it. The rest of the file names will be stored in a table and tagged, and the files themselves will be sent off for archiving, roughly as in the sketch below.
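A minimal sketch of that split, assuming a metadata table with a created date and an archived flag (the schema, column names, and archive mount are all hypothetical):

```python
import shutil
import sqlite3
from pathlib import Path

DB = "files.db"                     # hypothetical: files(name, path, created, archived)
ARCHIVE_DIR = Path("/mnt/archive")  # hypothetical archive location

def archive_old_files(days=90):
    # Move anything older than ~3 months out of hot storage; the name and
    # tags stay in the table so the web app can still locate archived files.
    with sqlite3.connect(DB) as conn:
        rows = conn.execute(
            "SELECT name, path FROM files "
            "WHERE archived = 0 AND created < date('now', ?)",
            ("-%d days" % days,),
        ).fetchall()
        for name, path in rows:
            dest = ARCHIVE_DIR / name
            shutil.move(path, dest)
            conn.execute(
                "UPDATE files SET archived = 1, path = ? WHERE name = ?",
                (str(dest), name),
            )
```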


Sorry man, I assumed it was plain data... in that case get AWS or Azure, man... cheap and the best option...


My company has a lot of infrastructure in place, and they are not into the cloud yet.

And I may not be able to convince my manager to go the cloud route.


 

Tell them it is the cost-effective option and that you can save a lot on storage by doing it this way, so they may lean towards it rather than thinking about rackspace....

