
Disadvantages of getting married to a green card holder


vendettaa


5 minutes ago, mantis said:

In your current visa situation, it's better to marry an H1 holder... after the GC it takes 5 years for him to get naturalized, and in the meantime you can't be on his status as a dependent. Or wait until he gets citizenship and then marry.

It's fine, my H1 is my own; I'll go through my own H1 struggles. I have no preference between H1 and GC.


10 minutes ago, Android_Halwa said:

Time is running out... OPT is about to end too; thanks to Cap-Gap there's time left until October; the job I have looks like it could go any day, and if that goes too, then my H1B is as good as approved....

Strange how, when you're in a situation like this, all the matches that show up are GC holders and citizens... what a coincidence I say..!!!

Aunty is lucky, brother


2 minutes ago, ekunadam_enkanna said:

Aunty is lucky, brother

Hi uncle, please don't quote that mental case's comments. I don't do that, but that dog is still after me.

There's no limit to his whining; what a loser he seems to be.


4 minutes ago, perugu_vada said:

Aunty, what is this marriage fuss of yours every day? [GIF: Brahmanandam & Jr. NTR comedy scene from Andhrawala]

As the senior here, you could give your blessings and tips, right?


Just now, LastManStanding said:

As the senior here, you could give your blessings and tips, right?

Hii bro

What are your Friday evening plans?

Sister-in-law, are you just staying home?


58 minutes ago, LastManStanding said:

Why would there be any? If you marry a GC holder, you get a GC right away.

Also, GC holders might prefer someone who is already living here, because if they marry someone in India they can't bring them here for 5 years.

Bro, you get a GC if you marry a GC holder? I thought you get a GC only if they're a US citizen, right, $t@rw@r


12 minutes ago, vendettaa said:

Hive is being used here for processing, with MongoDB as the target. You have to add the MongoDB jars to Hive and start the MongoDB shell; in the MongoDB shell you can create an external table by setting properties (1.8_create_external_table_Academp_mongo).

 

source https://acadgild.com/blog/how-to-export-data-from-hive-to-mongodb

 

One more thing: Hive is not a relational database; it is a data warehouse built on top of Apache Hadoop for providing data summarization, query, and analysis. It differs from a relational database in that it stores the schema in a database and the processed data in HDFS.

 

1.9_insert_into_table_Academp_mongo_from

I would use Spark for the complete process; I wouldn't use that old-school method for writing into MongoDB.

I would use Scala-Spark to build the pipeline and deploy the job on a Spark-Hadoop cluster.

Not making any sense. What do you mean using Hive for processing? It's a data store. When I said DBMS I didn't say RDBMS, which is a structured format; Hive isn't, which is why I said DBMS. If you're saying Spark will do the translation between the Hive data store and move the data to MongoDB, I agree. In Spark you can make sure the connection strings are configured when communicating with each DBMS, so it will initiate the instance and do the translation. But in your first post you said "I open the MongoDB shell and insert the Hive table." That's why I was confused.
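For reference, the jar-plus-external-table route described in the quoted post is usually wired up roughly as below. This is a minimal sketch, assuming the mongo-hadoop Hive connector (its MongoStorageHandler class and 'mongo.uri' table property) and a running HiveServer2; it is driven over Hive JDBC here only so the thread's examples stay in one language, and the same two statements can be typed directly in the Hive shell. The table, database, and connection details are illustrative, not taken from the thread or the linked blog.

import java.sql.DriverManager

object HiveToMongoViaStorageHandler {
  def main(args: Array[String]): Unit = {
    // URL and credentials depend on your HiveServer2 setup.
    val conn = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "hive", "")
    val stmt = conn.createStatement()

    // External table whose storage lives in MongoDB rather than HDFS
    // (requires the mongo-hadoop-hive and MongoDB Java driver jars on Hive's classpath).
    stmt.execute(
      """CREATE EXTERNAL TABLE IF NOT EXISTS academp_mongo (id INT, name STRING)
        |STORED BY 'com.mongodb.hadoop.hive.MongoStorageHandler'
        |TBLPROPERTIES ('mongo.uri' = 'mongodb://localhost:27017/testdb.academp')""".stripMargin)

    // Copy rows from an ordinary Hive table into the MongoDB-backed one.
    stmt.execute("INSERT INTO TABLE academp_mongo SELECT id, name FROM academp")

    stmt.close()
    conn.close()
  }
}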


27 minutes ago, tacobell fan said:

Not making any sense. What do you mean using Hive for processing? It's a data store. When I said DBMS I didn't say RDBMS, which is a structured format; Hive isn't, which is why I said DBMS. If you're saying Spark will do the translation between the Hive data store and move the data to MongoDB, I agree. In Spark you can make sure the connection strings are configured when communicating with each DBMS, so it will initiate the instance and do the translation. But in your first post you said "I open the MongoDB shell and insert the Hive table." That's why I was confused.

Yeah, Hive is used for processing and also for saving data; we're writing the logic in Hive itself, that's HQL. If we bring the "data store" concept into this, there's a lot more to discuss. I was talking about processing data on Hadoop through Hive: if the data is in Hadoop, how do you process the HDFS data?

That again brings in the concept of external tables and processing in Hive. Or, even without Hive, we can talk to HDFS directly and process the data. Hive uses simple SQL to process data in HDFS.
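As a concrete illustration of that external-table idea, the sketch below defines a table over files already sitting in HDFS and queries it with plain SQL. It runs through Spark's Hive support only to keep these examples in one language; the same DDL and query work unchanged in the Hive shell. The path, table name, and columns are made up for illustration.

import org.apache.spark.sql.SparkSession

object ExternalTableOverHdfs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("external-table-over-hdfs")
      .enableHiveSupport() // use the Hive metastore so the DDL below is plain HQL
      .getOrCreate()

    // Point a schema at files that already sit in HDFS; nothing is copied.
    // Hive keeps only the schema in its metastore and the data stays where it is.
    spark.sql(
      """CREATE EXTERNAL TABLE IF NOT EXISTS academp (id INT, name STRING)
        |ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
        |STORED AS TEXTFILE
        |LOCATION 'hdfs:///data/academp'""".stripMargin)

    // Simple SQL over the HDFS files.
    spark.sql("SELECT COUNT(*) AS row_count FROM academp").show()

    spark.stop()
  }
}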

 

Look at my previous posts; I wrote them out as method 1 and method 2, please read them.

First approach: Spark does everything, exactly what you said; that is one approach. Use a DataFrame to read from or write to Hive, and the DataFrame can then be saved to whatever NoSQL DB using connection strings (sketched below).

The second approach is without Spark: all the logic/processing is written in Hive, and Hive plus Sqoop are used to do the ETL process.
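A minimal sketch of the first approach (Spark end to end), assuming the mongo-spark connector is on the classpath (for example via spark-submit --packages org.mongodb.spark:mongo-spark-connector_2.12:3.0.2). The option keys follow the 3.x connector and differ in other versions; the table, database, and collection names are illustrative.

import org.apache.spark.sql.SparkSession

object HiveToMongoWithSpark {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-to-mongo")
      .enableHiveSupport() // read the source table from the Hive metastore
      .config("spark.mongodb.output.uri", "mongodb://localhost:27017/testdb.academp")
      .getOrCreate()

    // Read the Hive table into a DataFrame.
    val source = spark.table("default.academp")

    // Any processing logic can be expressed as SQL over the DataFrame.
    source.createOrReplaceTempView("academp_v")
    val result = spark.sql("SELECT id, name FROM academp_v WHERE id IS NOT NULL")

    // Write the result to the MongoDB collection named in the output URI.
    result.write
      .format("mongo")
      .mode("append")
      .save()

    spark.stop()
  }
}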


22 minutes ago, Sucker said:

Bro, you get a GC if you marry a GC holder? I thought you get a GC only if they're a US citizen, right, $t@rw@r

Is that so? Sorry bro, in that case... I really put my foot in it.


22 minutes ago, tacobell fan said:

Not making any sense. What do you mean using Hive for processing? It's a data store. When I said DBMS I didn't say RDBMS, which is a structured format; Hive isn't, which is why I said DBMS. If you're saying Spark will do the translation between the Hive data store and move the data to MongoDB, I agree. In Spark you can make sure the connection strings are configured when communicating with each DBMS, so it will initiate the instance and do the translation. But in your first post you said "I open the MongoDB shell and insert the Hive table." That's why I was confused.

 

2 minutes ago, vendettaa said:

Yeah, Hive is used for processing; we're writing the logic in Hive itself.

Look at my previous posts; I wrote them out as method 1 and method 2, please read them.

First approach: Spark does everything, exactly what you said; that is one approach. Use a DataFrame to read from or write to Hive, and the DataFrame can then be saved to whatever NoSQL DB.

The second approach is without Spark: all the logic/processing is written in Hive, and Hive plus Sqoop are used to do the ETL process.

What is this, I say, it's like after-hours tutorials at Chaitanya and Narayana colleges, @LastManStanding uncle


1 hour ago, vendettaa said:

For some reason I feel an H1 would be better than a green card.

If you marry a green card holder, what benefits does the spouse get?

I'm not looking at immigration status, but green card matches keep coming, so I'm getting doubts.

I just want to know what the advantages and disadvantages are.

They'll be the same as the first one.

